Posted to hdfs-dev@hadoop.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2013/10/10 15:33:07 UTC

Build failed in Jenkins: Hadoop-Hdfs-trunk #1548

See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1548/changes>

Changes:

[devaraj] YARN-879. Fixed tests w.r.t o.a.h.y.server.resourcemanager.Application. Contributed by Junping Du.

[brandonli] HDFS-5337. should do hsync for a commit request even when there are no pending writes. Contributed by Brandon Li

[cnauroth] HADOOP-10031. FsShell -get/copyToLocal/moveFromLocal should support Windows local path. Contributed by Chuan Liu.

[vinodkv] YARN-1283. Fixed RM to give a fully-qualified proxy URL for an application so that clients don't need to do scheme-mangling. Contributed by Omkar Vinit Joshi.

[jlowe] MAPREDUCE-5102. fix coverage org.apache.hadoop.mapreduce.lib.db and org.apache.hadoop.mapred.lib.db. Contributed by Aleksey Gorshkov, Andrey Klochkov, and Nathan Roberts

[cmccabe] HDFS-5323.  Remove some deadcode in BlockManager (Colin Patrick McCabe)

[tucu] Amending yarn CHANGES.txt moving YARN-1284 to 2.2.1

[jlowe] MAPREDUCE-5569. FloatSplitter is not generating correct splits. Contributed by Nathan Roberts

[kihwal] HDFS-4510. Cover classes ClusterJspHelper/NamenodeJspHelper with unit tests. Contributed by Andrey Klochkov.

[daryn] HADOOP-9470. eliminate duplicate FQN tests in different Hadoop modules (Ivan A. Veselovsky via daryn)

------------------------------------------
[...truncated 12441 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.072 sec - in org.apache.hadoop.hdfs.TestRestartDFS
Running org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.386 sec - in org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Running org.apache.hadoop.hdfs.TestDFSRemove
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.994 sec - in org.apache.hadoop.hdfs.TestDFSRemove
Running org.apache.hadoop.hdfs.TestHDFSTrash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.121 sec - in org.apache.hadoop.hdfs.TestHDFSTrash
Running org.apache.hadoop.hdfs.TestClientReportBadBlock
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 54.604 sec - in org.apache.hadoop.hdfs.TestClientReportBadBlock
Running org.apache.hadoop.hdfs.TestQuota
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.088 sec - in org.apache.hadoop.hdfs.TestQuota
Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.13 sec - in org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Running org.apache.hadoop.hdfs.TestDatanodeRegistration
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.131 sec - in org.apache.hadoop.hdfs.TestDatanodeRegistration
Running org.apache.hadoop.hdfs.TestAbandonBlock
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.039 sec - in org.apache.hadoop.hdfs.TestAbandonBlock
Running org.apache.hadoop.hdfs.TestDFSShell
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 41.198 sec - in org.apache.hadoop.hdfs.TestDFSShell
Running org.apache.hadoop.hdfs.TestListFilesInDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.438 sec - in org.apache.hadoop.hdfs.TestListFilesInDFS
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Running org.apache.hadoop.hdfs.TestPeerCache
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.317 sec - in org.apache.hadoop.hdfs.TestPeerCache
Running org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 8.555 sec - in org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Running org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.24 sec - in org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Running org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 108.195 sec - in org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Running org.apache.hadoop.hdfs.TestMultiThreadedHflush
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.024 sec - in org.apache.hadoop.hdfs.TestMultiThreadedHflush
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.785 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Running org.apache.hadoop.hdfs.TestMiniDFSCluster
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.027 sec - in org.apache.hadoop.hdfs.TestMiniDFSCluster
Running org.apache.hadoop.hdfs.TestLease
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.21 sec - in org.apache.hadoop.hdfs.TestLease
Running org.apache.hadoop.hdfs.TestListFilesInFileContext
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.288 sec - in org.apache.hadoop.hdfs.TestListFilesInFileContext
Running org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.341 sec - in org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Running org.apache.hadoop.hdfs.TestDFSClientFailover
Tests run: 6, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 10.16 sec - in org.apache.hadoop.hdfs.TestDFSClientFailover
Running org.apache.hadoop.hdfs.TestFileAppend2
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.212 sec - in org.apache.hadoop.hdfs.TestFileAppend2
Running org.apache.hadoop.hdfs.TestLocalDFS
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.159 sec - in org.apache.hadoop.hdfs.TestLocalDFS
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.884 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.hdfs.TestSeekBug
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.754 sec - in org.apache.hadoop.hdfs.TestSeekBug
Running org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.884 sec - in org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Running org.apache.hadoop.hdfs.util.TestChunkedArrayList
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.363 sec - in org.apache.hadoop.hdfs.util.TestChunkedArrayList
Running org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.199 sec - in org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Running org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.195 sec - in org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Running org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Running org.apache.hadoop.hdfs.util.TestDiff
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.325 sec - in org.apache.hadoop.hdfs.util.TestDiff
Running org.apache.hadoop.hdfs.util.TestMD5FileUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.227 sec - in org.apache.hadoop.hdfs.util.TestMD5FileUtils
Running org.apache.hadoop.hdfs.util.TestDirectBufferPool
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.092 sec - in org.apache.hadoop.hdfs.util.TestDirectBufferPool
Running org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.158 sec - in org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Running org.apache.hadoop.hdfs.util.TestXMLUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestXMLUtils
Running org.apache.hadoop.hdfs.util.TestCyclicIteration
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestCyclicIteration
Running org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.165 sec - in org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Running org.apache.hadoop.hdfs.TestSetTimes
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.367 sec - in org.apache.hadoop.hdfs.TestSetTimes
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Running org.apache.hadoop.hdfs.TestBlockReaderLocal
Tests run: 11, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 12.208 sec - in org.apache.hadoop.hdfs.TestBlockReaderLocal
Running org.apache.hadoop.hdfs.TestHftpURLTimeouts
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1 sec - in org.apache.hadoop.hdfs.TestHftpURLTimeouts
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.903 sec - in org.apache.hadoop.cli.TestHDFSCLI
Running org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.132 sec - in org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Running org.apache.hadoop.fs.TestGlobPaths
Tests run: 31, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 4.341 sec - in org.apache.hadoop.fs.TestGlobPaths
Running org.apache.hadoop.fs.TestResolveHdfsSymlink
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.35 sec - in org.apache.hadoop.fs.TestResolveHdfsSymlink
Running org.apache.hadoop.fs.TestFcHdfsSetUMask
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.542 sec - in org.apache.hadoop.fs.TestFcHdfsSetUMask
Running org.apache.hadoop.fs.TestFcHdfsPermission
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.468 sec - in org.apache.hadoop.fs.TestFcHdfsPermission
Running org.apache.hadoop.fs.TestUrlStreamHandler
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.463 sec - in org.apache.hadoop.fs.TestUrlStreamHandler
Running org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.13 sec - in org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.672 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.85 sec - in org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.291 sec - in org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.146 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.074 sec - in org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Running org.apache.hadoop.fs.permission.TestStickyBit
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.131 sec - in org.apache.hadoop.fs.permission.TestStickyBit
Running org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.067 sec - in org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Running org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Tests run: 69, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.808 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Running org.apache.hadoop.fs.shell.TestHdfsTextCommand
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.443 sec - in org.apache.hadoop.fs.shell.TestHdfsTextCommand
Running org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.579 sec - in org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Running org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.758 sec - in org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Running org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Tests run: 7, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 0.208 sec - in org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Running org.apache.hadoop.fs.TestSymlinkHdfsDisable
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.998 sec - in org.apache.hadoop.fs.TestSymlinkHdfsDisable
Running org.apache.hadoop.fs.TestVolumeId
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.067 sec - in org.apache.hadoop.fs.TestVolumeId
Running org.apache.hadoop.fs.TestSymlinkHdfsFileSystem
Tests run: 72, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 7.846 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileSystem

Results :

Tests in error: 
  TestWebHdfsFileSystemContract.testResponseCode:465 » IO All datanodes 127.0.0....
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteHalfABlock:240->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » Remote
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock:244->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » Remote
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteOneAndAHalfBlocks:248->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteTwoBlocks:252->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testOverwrite:271->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteInNonExistentDirectory:295->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testDeleteRecursively:313->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileMoveToNonExistentDirectory:356->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileMoveToExistingDirectory:365->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileAsExistingFile:375->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileAsExistingDirectory:385->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameDirectoryMoveToExistingDirectory:407->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameDirectoryAsExistingFile:430->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameDirectoryAsExistingDirectory:439->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testInputStreamClosedTwice:461->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testOutputStreamClosedTwice:473 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testOverWriteAndRead:509->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testFilesystemIsCaseSensitive:535 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testZeroByteFilesAreFiles:562 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testMultiByteFilesAreFiles:575 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameChildDirForbidden:614->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameDirToSelf:647->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileToSelf:679->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testMoveFileUnderParent:693->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testLSRootDir:703->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testListStatusRootDir:710->FileSystemContractBaseTest.createFile:484 » IO

Tests run: 2125, Failures: 0, Errors: 27, Skipped: 50

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:57:47.187s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.810s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:57:49.836s
[INFO] Finished at: Thu Oct 10 13:32:22 UTC 2013
[INFO] Final Memory: 46M/361M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Updating MAPREDUCE-5569
Updating HDFS-5323
Updating HADOOP-9470
Updating HDFS-4510
Updating YARN-1284
Updating YARN-1283
Updating YARN-879
Updating HADOOP-10031
Updating MAPREDUCE-5102
Updating HDFS-5337

Hadoop-Hdfs-trunk - Build # 1550 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1550/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 11594 lines...]

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:45:00.623s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [2.057s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:45:03.529s
[INFO] Finished at: Sat Oct 12 13:19:44 UTC 2013
[INFO] Final Memory: 31M/296M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Updating HADOOP-10040
Updating HDFS-5322
Updating YARN-1044
Updating HDFS-5267
Updating YARN-1300
Updating HDFS-5276
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk - Build # 1551 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1551/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 11591 lines...]
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:45:15.686s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [2.127s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:45:18.659s
[INFO] Finished at: Sun Oct 13 13:19:57 UTC 2013
[INFO] Final Memory: 39M/344M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Updating HDFS-5329
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk - Build # 1552 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1552/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 11618 lines...]
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:44:29.257s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.800s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:44:31.893s
[INFO] Finished at: Mon Oct 14 13:19:14 UTC 2013
[INFO] Final Memory: 34M/402M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Updating MAPREDUCE-5329
Updating MAPREDUCE-5463
Updating YARN-305
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk - Build # 1553 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1553/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 11599 lines...]
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:44:23.362s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [4.022s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:44:28.230s
[INFO] Finished at: Tue Oct 15 13:19:16 UTC 2013
[INFO] Final Memory: 40M/373M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating HADOOP-10040
Updating HADOOP-9494
Updating HDFS-5352
Updating MAPREDUCE-5518
Updating YARN-1259
Updating HDFS-5342
Updating YARN-1182
Updating MAPREDUCE-5546
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
2 tests failed.
FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testResponseCode

Error Message:
All datanodes 127.0.0.1:36872 are bad. Aborting...

Stack Trace:
java.io.IOException: All datanodes 127.0.0.1:36872 are bad. Aborting...
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:350)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.hdfs.AppendTestUtil.testAppend(AppendTestUtil.java:198)
	at org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testResponseCode(TestWebHdfsFileSystemContract.java:465)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteHalfABlock

Error Message:
Read timed out

Stack Trace:
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(SocketInputStream.java:129)
	at java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
	at java.io.BufferedInputStream.read1(BufferedInputStream.java:258)
	at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
	at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:687)
	at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:632)
	at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1195)
	at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:379)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:331)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteHalfABlock(FileSystemContractBaseTest.java:240)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
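
[Editor's note] For context, the failing contract tests above all follow the same write-then-read pattern against WebHDFS, and the IOException surfaces when the output stream is closed and WebHdfsFileSystem finalizes the HTTP request. Below is a minimal, hedged sketch of that pattern in Java; the URI, port, path, and data sizes are placeholders and not taken from the build, and a real run needs a live NameNode with WebHDFS enabled (which the MiniDFSCluster inside the test provides).

// Hedged sketch of the write-then-read pattern exercised by
// FileSystemContractBaseTest.writeAndRead, here against WebHDFS.
// URI, port, and path are hypothetical placeholders.
import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class WebHdfsWriteReadSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // "webhdfs" is the scheme served by WebHdfsFileSystem; host/port are placeholders.
    FileSystem fs = FileSystem.get(URI.create("webhdfs://localhost:50070"), conf);
    Path file = new Path("/tmp/contract-test/half-a-block");
    byte[] data = new byte[512];

    // Write: the exceptions in the traces above are thrown from close(),
    // where the HTTP CREATE against the datanode is completed and validated.
    FSDataOutputStream out = fs.create(file, /* overwrite */ true);
    out.write(data);
    out.close();

    // Read the file back, mirroring writeReadAndDelete.
    FSDataInputStream in = fs.open(file);
    byte[] buf = new byte[data.length];
    in.readFully(buf);
    in.close();

    // Clean up, as the contract test does.
    fs.delete(file, /* recursive */ false);
    fs.close();
  }
}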



Hadoop-Hdfs-trunk - Build # 1555 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1555/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 12719 lines...]
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:44:20.427s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [3.907s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:44:25.192s
[INFO] Finished at: Thu Oct 17 13:19:07 UTC 2013
[INFO] Final Memory: 31M/322M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating HDFS-5283
Updating MAPREDUCE-5586
Updating HDFS-5360
Updating MAPREDUCE-5585
Updating HADOOP-9897
Updating HADOOP-10005
Updating HDFS-5370
Updating HADOOP-9078
Updating HDFS-4376
Updating HDFS-5334
Updating HDFS-5346
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
26 tests failed.
REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteOneAndAHalfBlocks

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:343)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneAndAHalfBlocks(FileSystemContractBaseTest.java:248)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteTwoBlocks

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:373)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:351)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteTwoBlocks(FileSystemContractBaseTest.java:252)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:350)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteTwoBlocks(FileSystemContractBaseTest.java:252)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testOverwrite

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:343)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOverwrite(FileSystemContractBaseTest.java:271)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteInNonExistentDirectory

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:373)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:351)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteInNonExistentDirectory(FileSystemContractBaseTest.java:295)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:350)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteInNonExistentDirectory(FileSystemContractBaseTest.java:295)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testDeleteRecursively

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:343)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testDeleteRecursively(FileSystemContractBaseTest.java:313)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameFileMoveToNonExistentDirectory

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:343)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileMoveToNonExistentDirectory(FileSystemContractBaseTest.java:356)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameFileMoveToExistingDirectory

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:373)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:351)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileMoveToExistingDirectory(FileSystemContractBaseTest.java:365)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:350)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileMoveToExistingDirectory(FileSystemContractBaseTest.java:365)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameFileAsExistingFile

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:373)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:351)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileAsExistingFile(FileSystemContractBaseTest.java:375)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:350)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileAsExistingFile(FileSystemContractBaseTest.java:375)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameFileAsExistingDirectory

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:343)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileAsExistingDirectory(FileSystemContractBaseTest.java:385)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameDirectoryMoveToExistingDirectory

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:343)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirectoryMoveToExistingDirectory(FileSystemContractBaseTest.java:407)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameDirectoryAsExistingFile

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:373)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:351)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirectoryAsExistingFile(FileSystemContractBaseTest.java:430)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:350)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirectoryAsExistingFile(FileSystemContractBaseTest.java:430)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameDirectoryAsExistingDirectory

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:343)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirectoryAsExistingDirectory(FileSystemContractBaseTest.java:439)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testInputStreamClosedTwice

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:373)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:351)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testInputStreamClosedTwice(FileSystemContractBaseTest.java:461)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:350)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testInputStreamClosedTwice(FileSystemContractBaseTest.java:461)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testOutputStreamClosedTwice

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:373)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:351)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOutputStreamClosedTwice(FileSystemContractBaseTest.java:473)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:350)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOutputStreamClosedTwice(FileSystemContractBaseTest.java:473)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testOverWriteAndRead

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:343)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOverWriteAndRead(FileSystemContractBaseTest.java:509)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testFilesystemIsCaseSensitive

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:373)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:351)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testFilesystemIsCaseSensitive(FileSystemContractBaseTest.java:535)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:350)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testFilesystemIsCaseSensitive(FileSystemContractBaseTest.java:535)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testZeroByteFilesAreFiles

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:343)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testZeroByteFilesAreFiles(FileSystemContractBaseTest.java:562)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testMultiByteFilesAreFiles

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:373)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:351)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testMultiByteFilesAreFiles(FileSystemContractBaseTest.java:575)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:350)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testMultiByteFilesAreFiles(FileSystemContractBaseTest.java:575)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameChildDirForbidden

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:343)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameChildDirForbidden(FileSystemContractBaseTest.java:614)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameDirToSelf

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:343)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirToSelf(FileSystemContractBaseTest.java:647)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameFileToSelf

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:373)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:351)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileToSelf(FileSystemContractBaseTest.java:679)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:350)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileToSelf(FileSystemContractBaseTest.java:679)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
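
Unlike the 500 above, "Couldn't set up IO streams" comes from the Hadoop IPC client layer when it cannot even establish a connection to its RPC target (here "localhost":42478). In this trace the failure occurred on the server side of the WebHDFS call and was propagated back to the test as a RemoteException, most likely the same thread exhaustion seen above. A rough sketch of how the RPC layer wraps such a local failure with the host details shown, with hypothetical names rather than the actual org.apache.hadoop.net.NetUtils code:

    // Illustrative sketch only; shows how a client-local failure becomes the
    // "Failed on local exception ...; Host Details ..." message seen above.
    import java.io.IOException;
    import java.net.InetSocketAddress;

    final class HostDetailsWrap {
      static IOException wrap(InetSocketAddress local, InetSocketAddress remote,
                              IOException cause) {
        return new IOException("Failed on local exception: " + cause
            + "; Host Details : local host is: \"" + local.getHostName()
            + "\"; destination host is: \"" + remote.getHostName()
            + "\":" + remote.getPort() + "; ", cause);
      }
    }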


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testMoveFileUnderParent

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:343)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testMoveFileUnderParent(FileSystemContractBaseTest.java:693)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testLSRootDir

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:373)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:351)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testLSRootDir(FileSystemContractBaseTest.java:703)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:350)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testLSRootDir(FileSystemContractBaseTest.java:703)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testListStatusRootDir

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:373)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:351)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testListStatusRootDir(FileSystemContractBaseTest.java:710)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":42478; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:350)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testListStatusRootDir(FileSystemContractBaseTest.java:710)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteOneBlock

Error Message:
File /test/hadoop/file could only be replicated to 0 nodes instead of minReplication (=1).  There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
 at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1391)
 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2528)
 at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:560)
 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:387)
 at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:59582)
 at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:605)
 at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:932)
 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2056)
 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2052)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:396)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1507)
 at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2050)


Stack Trace:
java.io.IOException: File /test/hadoop/file could only be replicated to 0 nodes instead of minReplication (=1).  There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1391)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2528)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:560)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:387)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:59582)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:605)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:932)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2056)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2052)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1507)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2050)

	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:373)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:351)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock(FileSystemContractBaseTest.java:244)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: File /test/hadoop/file could only be replicated to 0 nodes instead of minReplication (=1).  There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1391)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2528)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:560)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:387)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:59582)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:605)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:932)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2056)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2052)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1507)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2050)

	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:350)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock(FileSystemContractBaseTest.java:244)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
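
The arithmetic in this message is the whole story: two datanodes are running, but both are already on the operation's excluded-node list (typically after earlier pipeline failures against them in the same run), so the NameNode's block-placement code has zero eligible targets left and cannot meet minReplication=1. A simplified illustration of that guard, with hypothetical names rather than the actual BlockManager.chooseTarget logic:

    // Illustrative sketch only: no eligible targets remain once the excluded
    // set covers every live datanode.
    import java.io.IOException;
    import java.util.ArrayList;
    import java.util.List;
    import java.util.Set;

    final class TargetCheck {
      static List<String> chooseTargets(List<String> liveNodes, Set<String> excluded,
                                        int minReplication, String path) throws IOException {
        List<String> eligible = new ArrayList<String>(liveNodes);
        eligible.removeAll(excluded);
        if (eligible.size() < minReplication) {
          throw new IOException("File " + path + " could only be replicated to "
              + eligible.size() + " nodes instead of minReplication (=" + minReplication
              + ").  There are " + liveNodes.size() + " datanode(s) running and "
              + excluded.size() + " node(s) are excluded in this operation.");
        }
        return eligible;
      }
    }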


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testResponseCode

Error Message:
All datanodes 127.0.0.1:60850 are bad. Aborting...

Stack Trace:
java.io.IOException: All datanodes 127.0.0.1:60850 are bad. Aborting...
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:350)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.hdfs.AppendTestUtil.testAppend(AppendTestUtil.java:198)
	at org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testResponseCode(TestWebHdfsFileSystemContract.java:465)
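
"All datanodes ... are bad. Aborting..." is the end state of the client write pipeline: each time a datanode in the pipeline fails during the append, recovery drops it and retries with whatever remains, and once no datanode is left the stream has nowhere to write and gives up. A stripped-down sketch of that recovery condition, with hypothetical names rather than the actual DFSOutputStream code:

    // Illustrative sketch only: the write aborts once pipeline recovery has
    // removed every datanode.
    import java.io.IOException;
    import java.util.List;

    final class PipelineRecovery {
      static void handleFailure(List<String> pipeline, String failedNode)
          throws IOException {
        pipeline.remove(failedNode);   // drop the failed node, then retry with the rest
        if (pipeline.isEmpty()) {
          throw new IOException("All datanodes " + failedNode + " are bad. Aborting...");
        }
      }
    }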



Hadoop-Hdfs-trunk - Build # 1560 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1560/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 11586 lines...]
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:02.981s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.841s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:05.677s
[INFO] Finished at: Tue Oct 22 11:35:47 UTC 2013
[INFO] Final Memory: 45M/736M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: ExecutionException; nested exception is java.util.concurrent.ExecutionException: java.lang.RuntimeException: The forked VM terminated without saying properly goodbye. VM crash or System.exit called ?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs && /home/jenkins/tools/java/jdk1.6.0_26/jre/bin/java -Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter3498690175466967346.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire497387182831793143tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_04799335426329602194tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating HDFS-4885
Updating YARN-1288
Updating HADOOP-9291
Updating YARN-1258
Updating YARN-1315
Updating HDFS-5382
Updating HDFS-5347
Updating YARN-1331
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Hadoop-Hdfs-trunk - Build # 1562 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1562/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 12032 lines...]
main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:44:21.066s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.843s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:44:23.761s
[INFO] Finished at: Thu Oct 24 13:18:10 UTC 2013
[INFO] Final Memory: 38M/323M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating HDFS-5363
Updating HADOOP-10064
Updating HDFS-5403
Updating HDFS-5341
Updating HDFS-5306
Updating HADOOP-9016
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
10 tests failed.
FAILED:  org.apache.hadoop.hdfs.TestNameNodeHttpServer.testSslConfiguration

Error Message:
Address already in use

Stack Trace:
java.net.BindException: Address already in use
	at java.net.PlainSocketImpl.socketBind(Native Method)
	at java.net.PlainSocketImpl.bind(PlainSocketImpl.java:383)
	at java.net.ServerSocket.bind(ServerSocket.java:328)
	at java.net.ServerSocket.<init>(ServerSocket.java:194)
	at javax.net.ssl.SSLServerSocket.<init>(SSLServerSocket.java:106)
	at com.sun.net.ssl.internal.ssl.SSLServerSocketImpl.<init>(SSLServerSocketImpl.java:108)
	at com.sun.net.ssl.internal.ssl.SSLServerSocketFactoryImpl.createServerSocket(SSLServerSocketFactoryImpl.java:72)
	at org.mortbay.jetty.security.SslSocketConnector.newServerSocket(SslSocketConnector.java:478)
	at org.mortbay.jetty.bio.SocketConnector.open(SocketConnector.java:73)
	at org.mortbay.jetty.AbstractConnector.doStart(AbstractConnector.java:283)
	at org.mortbay.jetty.bio.SocketConnector.doStart(SocketConnector.java:147)
	at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
	at org.mortbay.jetty.Server.doStart(Server.java:235)
	at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
	at org.apache.hadoop.http.HttpServer.start(HttpServer.java:808)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:121)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:630)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:487)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:688)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:673)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1262)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNode(MiniDFSCluster.java:895)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:786)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:644)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:334)
	at org.apache.hadoop.hdfs.MiniDFSCluster$Builder.build(MiniDFSCluster.java:316)
	at org.apache.hadoop.hdfs.TestNameNodeHttpServer.testSslConfiguration(TestNameNodeHttpServer.java:34)
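
java.net.BindException: Address already in use means the HTTPS port this test asks Jetty to bind is still held by another process (or a leftover listener from an earlier test) on the shared Jenkins slave. The standard way to make such tests immune to port collisions is to bind to port 0 so the OS hands out a free ephemeral port; a small standalone sketch:

    // Standalone sketch: port 0 asks the OS for any free ephemeral port,
    // which sidesteps "Address already in use" on shared build machines.
    import java.io.IOException;
    import java.net.ServerSocket;

    final class EphemeralPortDemo {
      public static void main(String[] args) throws IOException {
        ServerSocket socket = new ServerSocket(0);
        try {
          // Hand socket.getLocalPort() to whatever component needs the port.
          System.out.println("Bound to free port " + socket.getLocalPort());
        } finally {
          socket.close();
        }
      }
    }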


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testResponseCode

Error Message:
All datanodes 127.0.0.1:33047 are bad. Aborting...

Stack Trace:
java.io.IOException: All datanodes 127.0.0.1:33047 are bad. Aborting...
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.hdfs.AppendTestUtil.testAppend(AppendTestUtil.java:198)
	at org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testResponseCode(TestWebHdfsFileSystemContract.java:465)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteHalfABlock

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteHalfABlock(FileSystemContractBaseTest.java:240)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteOneBlock

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":49053; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":49053; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock(FileSystemContractBaseTest.java:244)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":49053; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock(FileSystemContractBaseTest.java:244)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteOneAndAHalfBlocks

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":49053; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":49053; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneAndAHalfBlocks(FileSystemContractBaseTest.java:248)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":49053; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneAndAHalfBlocks(FileSystemContractBaseTest.java:248)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteTwoBlocks

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":49053; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":49053; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteTwoBlocks(FileSystemContractBaseTest.java:252)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":49053; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteTwoBlocks(FileSystemContractBaseTest.java:252)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testOverwrite

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":49053; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":49053; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOverwrite(FileSystemContractBaseTest.java:271)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":49053; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOverwrite(FileSystemContractBaseTest.java:271)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
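
The "Couldn't set up IO streams" failures in this run come from the Hadoop IPC client failing to start its connection threads, and they appear alongside "unable to create new native thread" responses in the same build, so thread exhaustion on the slave is the likely common cause rather than a bug in the tests themselves. What the contract test exercises through WebHDFS is simply "create the file, write a few bytes, close it"; below is a minimal sketch of that path, assuming a placeholder filesystem URI, port, and file path (the actual run used localhost with an ephemeral port), with the class name invented for the sketch. The exception above is raised when close() sends the final WebHDFS request.

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class WebHdfsCreateSketch {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder URI and path; the failing run used localhost with an ephemeral port.
        FileSystem fs = FileSystem.get(URI.create("webhdfs://localhost:50070"), conf);
        Path file = new Path("/test/overwrite.dat");
        FSDataOutputStream out = fs.create(file, true); // overwrite = true
        out.write(new byte[] { 0, 1, 2, 3 });
        out.close(); // the final WebHDFS request is sent here, which is where
                     // WebHdfsFileSystem$1.close() surfaces the IOException above
      }
    }
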


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteInNonExistentDirectory

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":49053; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":49053; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteInNonExistentDirectory(FileSystemContractBaseTest.java:295)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":49053; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteInNonExistentDirectory(FileSystemContractBaseTest.java:295)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testDeleteRecursively

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testDeleteRecursively(FileSystemContractBaseTest.java:313)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
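
The "code=500 != 201" message is WebHdfsFileSystem.validateResponse comparing the HTTP status it received against the status it expects for the operation (201 Created for op=CREATE); the 500 body carries the server-side "unable to create new native thread" error. A hedged sketch of that kind of check with plain HttpURLConnection follows; the endpoint URL, port, and class name are illustrative only, and a real WebHDFS CREATE is normally redirected by the NameNode to a DataNode before the 201 comes back.

    import java.io.IOException;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class ExpectCreatedSketch {
      // Mirrors the shape of the check in validateResponse: compare the actual
      // status code with the one expected for the operation, fail otherwise.
      static void expect(HttpURLConnection conn, int expected, String op) throws IOException {
        int code = conn.getResponseCode();
        if (code != expected) {
          throw new IOException("Unexpected HTTP response: code=" + code
              + " != " + expected + ", op=" + op);
        }
      }

      public static void main(String[] args) throws Exception {
        // Illustrative endpoint only; in practice the NameNode redirects the CREATE
        // to a DataNode, and it is that second request that should return 201.
        URL url = new URL("http://localhost:50070/webhdfs/v1/test/f?op=CREATE&overwrite=true");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("PUT");
        conn.setDoOutput(true);
        conn.getOutputStream().close();                         // zero-byte body
        expect(conn, HttpURLConnection.HTTP_CREATED, "CREATE"); // 201 Created
        conn.disconnect();
      }
    }
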


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameFileMoveToNonExistentDirectory

Error Message:
Read timed out

Stack Trace:
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(SocketInputStream.java:129)
	at java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
	at java.io.BufferedInputStream.read1(BufferedInputStream.java:258)
	at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
	at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:687)
	at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:632)
	at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1195)
	at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:379)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:318)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileMoveToNonExistentDirectory(FileSystemContractBaseTest.java:356)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
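
The SocketTimeoutException above is the client-side face of the same overloaded minicluster: HttpURLConnection.getResponseCode() blocks in parseHTTPHeader() until the socket read timeout expires because the server never finishes sending its response headers. A minimal sketch of where those timeouts are configured, with the URL, port, and timeout values as placeholder assumptions:

    import java.net.HttpURLConnection;
    import java.net.SocketTimeoutException;
    import java.net.URL;

    public class ReadTimeoutSketch {
      public static void main(String[] args) throws Exception {
        // Placeholder WebHDFS URL; only the timeout handling matters here.
        URL url = new URL("http://localhost:50070/webhdfs/v1/?op=LISTSTATUS");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setConnectTimeout(60000); // time allowed to establish the TCP connection
        conn.setReadTimeout(60000);    // time allowed for each blocking read, e.g. the headers
        try {
          int code = conn.getResponseCode(); // blocks reading headers; times out on a wedged server
          System.out.println("HTTP " + code);
        } catch (SocketTimeoutException e) {
          System.err.println("Read timed out: " + e.getMessage());
        } finally {
          conn.disconnect();
        }
      }
    }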



Hadoop-Hdfs-trunk - Build # 1564 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1564/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 12029 lines...]
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:45:09.505s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [5.133s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:45:15.483s
[INFO] Finished at: Sat Oct 26 13:18:57 UTC 2013
[INFO] Final Memory: 36M/378M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating HADOOP-10072
Updating HDFS-5257
Updating YARN-1333
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
10 tests failed.
REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteEmptyFile

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteEmptyFile(FileSystemContractBaseTest.java:236)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteHalfABlock

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":54672; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":54672; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteHalfABlock(FileSystemContractBaseTest.java:240)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":54672; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteHalfABlock(FileSystemContractBaseTest.java:240)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testResponseCode

Error Message:
All datanodes 127.0.0.1:52181 are bad. Aborting...

Stack Trace:
java.io.IOException: All datanodes 127.0.0.1:52181 are bad. Aborting...
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.hdfs.AppendTestUtil.testAppend(AppendTestUtil.java:198)
	at org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testResponseCode(TestWebHdfsFileSystemContract.java:465)
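
"All datanodes ... are bad. Aborting..." is the message an HDFS output stream produces when every datanode in its write pipeline has failed and no replacement is available; here it reaches the test through the WebHDFS response because AppendTestUtil.testAppend appends to an existing file and closes the stream. A rough sketch of that append path, with the filesystem URI, port, file path, and class name as placeholder assumptions:

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class AppendSketch {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder URI and path; the test appends to a file it created earlier.
        FileSystem fs = FileSystem.get(URI.create("webhdfs://localhost:50070"), conf);
        Path file = new Path("/test/append.dat");
        FSDataOutputStream out = fs.append(file);
        out.write(new byte[4096]);
        out.close(); // the write goes through a datanode pipeline; once every node in
                     // that pipeline has failed, the stream aborts with the message above
      }
    }
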


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteOneBlock

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":54672; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":54672; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock(FileSystemContractBaseTest.java:244)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":54672; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock(FileSystemContractBaseTest.java:244)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteOneAndAHalfBlocks

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":54672; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":54672; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneAndAHalfBlocks(FileSystemContractBaseTest.java:248)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":54672; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneAndAHalfBlocks(FileSystemContractBaseTest.java:248)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteTwoBlocks

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":54672; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":54672; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteTwoBlocks(FileSystemContractBaseTest.java:252)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":54672; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteTwoBlocks(FileSystemContractBaseTest.java:252)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testOverwrite

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOverwrite(FileSystemContractBaseTest.java:271)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteInNonExistentDirectory

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":54672; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":54672; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteInNonExistentDirectory(FileSystemContractBaseTest.java:295)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":54672; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteInNonExistentDirectory(FileSystemContractBaseTest.java:295)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testDeleteRecursively

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testDeleteRecursively(FileSystemContractBaseTest.java:313)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameFileMoveToNonExistentDirectory

Error Message:
Read timed out

Stack Trace:
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(SocketInputStream.java:129)
	at java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
	at java.io.BufferedInputStream.read1(BufferedInputStream.java:258)
	at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
	at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:687)
	at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:632)
	at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1195)
	at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:379)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:318)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileMoveToNonExistentDirectory(FileSystemContractBaseTest.java:356)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)



Hadoop-Hdfs-trunk - Build # 1565 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1565/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 12848 lines...]
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:46:02.708s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [2.325s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:46:05.882s
[INFO] Finished at: Sun Oct 27 13:20:42 UTC 2013
[INFO] Final Memory: 38M/352M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
11 tests failed.
REGRESSION:  org.apache.hadoop.hdfs.server.balancer.TestBalancerWithNodeGroup.testBalancerWithNodeGroup

Error Message:
test timed out after 60000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 60000 milliseconds
	at java.lang.Thread.sleep(Native Method)
	at org.apache.hadoop.hdfs.server.balancer.Balancer.waitForMoveCompletion(Balancer.java:1169)
	at org.apache.hadoop.hdfs.server.balancer.Balancer.dispatchBlockMoves(Balancer.java:1142)
	at org.apache.hadoop.hdfs.server.balancer.Balancer.run(Balancer.java:1416)
	at org.apache.hadoop.hdfs.server.balancer.Balancer.run(Balancer.java:1467)
	at org.apache.hadoop.hdfs.server.balancer.TestBalancerWithNodeGroup.runBalancer(TestBalancerWithNodeGroup.java:173)
	at org.apache.hadoop.hdfs.server.balancer.TestBalancerWithNodeGroup.testBalancerWithNodeGroup(TestBalancerWithNodeGroup.java:302)
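
The 60000 ms figure above comes from JUnit 4's per-test timeout: when the annotated method runs past the limit, JUnit interrupts the test thread and fails the test with exactly this "test timed out after 60000 milliseconds" exception. A minimal, self-contained sketch of that mechanism follows; the class name and sleep duration are illustrative only and are not taken from TestBalancerWithNodeGroup.

    import org.junit.Test;

    public class TimeoutSketch {
        // JUnit 4 interrupts the test thread once the timeout elapses and
        // reports "test timed out after 60000 milliseconds".
        @Test(timeout = 60000)
        public void runsPastTheTimeout() throws InterruptedException {
            Thread.sleep(120000); // stands in for a block move that never completes
        }
    }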


REGRESSION:  org.apache.hadoop.hdfs.server.balancer.TestBalancerWithNodeGroup.testBalancerEndInNoMoveProgress

Error Message:
null

Stack Trace:
java.lang.AssertionError: null
	at org.junit.Assert.fail(Assert.java:92)
	at org.junit.Assert.assertTrue(Assert.java:43)
	at org.junit.Assert.assertTrue(Assert.java:54)
	at org.apache.hadoop.hdfs.server.balancer.TestBalancerWithNodeGroup.runBalancerCanFinish(TestBalancerWithNodeGroup.java:188)
	at org.apache.hadoop.hdfs.server.balancer.TestBalancerWithNodeGroup.testBalancerEndInNoMoveProgress(TestBalancerWithNodeGroup.java:347)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteHalfABlock

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteHalfABlock(FileSystemContractBaseTest.java:240)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
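
The "unable to create new native thread" text in this and several of the following failures is the JVM's OutOfMemoryError raised when the operating system refuses to give the process another native thread, which points at a resource limit on the build slave rather than at WebHDFS itself. A minimal sketch that reproduces the same error once the per-process thread limit is exhausted (the limit reached depends on ulimit and available memory; run only in a throwaway environment):

    public class NativeThreadExhaustionSketch {
        public static void main(String[] args) {
            int started = 0;
            try {
                // Keep starting idle threads until the OS refuses to create
                // another one; the JVM then throws
                // java.lang.OutOfMemoryError: unable to create new native thread.
                while (true) {
                    new Thread(() -> {
                        try {
                            Thread.sleep(Long.MAX_VALUE);
                        } catch (InterruptedException ignored) {
                        }
                    }).start();
                    started++;
                }
            } catch (OutOfMemoryError e) {
                System.err.println("Failed after " + started + " threads: " + e.getMessage());
            }
        }
    }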


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testResponseCode

Error Message:
All datanodes 127.0.0.1:53392 are bad. Aborting...

Stack Trace:
java.io.IOException: All datanodes 127.0.0.1:53392 are bad. Aborting...
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.hdfs.AppendTestUtil.testAppend(AppendTestUtil.java:198)
	at org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testResponseCode(TestWebHdfsFileSystemContract.java:465)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteOneBlock

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":44590; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":44590; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock(FileSystemContractBaseTest.java:244)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":44590; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock(FileSystemContractBaseTest.java:244)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteOneAndAHalfBlocks

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":44590; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":44590; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneAndAHalfBlocks(FileSystemContractBaseTest.java:248)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":44590; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneAndAHalfBlocks(FileSystemContractBaseTest.java:248)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteTwoBlocks

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":44590; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":44590; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteTwoBlocks(FileSystemContractBaseTest.java:252)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":44590; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteTwoBlocks(FileSystemContractBaseTest.java:252)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testOverwrite

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOverwrite(FileSystemContractBaseTest.java:271)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteInNonExistentDirectory

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteInNonExistentDirectory(FileSystemContractBaseTest.java:295)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testDeleteRecursively

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testDeleteRecursively(FileSystemContractBaseTest.java:313)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameFileMoveToNonExistentDirectory

Error Message:
Read timed out

Stack Trace:
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(SocketInputStream.java:129)
	at java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
	at java.io.BufferedInputStream.read1(BufferedInputStream.java:258)
	at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
	at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:687)
	at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:632)
	at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1195)
	at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:379)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:318)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileMoveToNonExistentDirectory(FileSystemContractBaseTest.java:356)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
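
The "Read timed out" above is a plain java.net.SocketTimeoutException thrown by HttpURLConnection while the test waits for the (by then overloaded) HTTP server to answer the CREATE call. A minimal sketch of how that exception surfaces on the client side; the URL and the deliberately tiny read timeout are illustrative only:

    import java.io.IOException;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class ReadTimeoutSketch {
        public static void main(String[] args) throws IOException {
            HttpURLConnection conn = (HttpURLConnection)
                new URL("http://localhost:50070/webhdfs/v1/tmp?op=LISTSTATUS").openConnection();
            conn.setConnectTimeout(5000);
            // With a 1 ms read timeout, getResponseCode() throws
            // java.net.SocketTimeoutException: Read timed out unless the
            // server replies essentially instantly.
            conn.setReadTimeout(1);
            System.out.println(conn.getResponseCode());
        }
    }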



Hadoop-Hdfs-trunk - Build # 1567 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1567/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 12784 lines...]
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [49.486s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.953s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 52.300s
[INFO] Finished at: Tue Oct 29 11:34:36 UTC 2013
[INFO] Final Memory: 49M/658M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: ExecutionException; nested exception is java.util.concurrent.ExecutionException: java.lang.RuntimeException: The forked VM terminated without saying properly goodbye. VM crash or System.exit called ?
[ERROR] Command was/bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs && /home/jenkins/tools/java/jdk1.6.0_26/jre/bin/java -Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter5090675212736170239.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire7161554381034715858tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_06648294810194451524tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating HDFS-4949
Updating YARN-1109
Updating YARN-1349
Updating HDFS-5413
Updating MAPREDUCE-4680
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Jenkins build is back to normal : Hadoop-Hdfs-trunk #1570

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1570/changes>


Build failed in Jenkins: Hadoop-Hdfs-trunk #1569

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1569/changes>

Changes:

[tucu] YARN-1343. NodeManagers additions/restarts are not reported as node updates in AllocateResponse responses to AMs. (tucu)

[cnauroth] HDFS-4633. Change attribution in CHANGES.txt to version 2.2.1.

[vinodkv] YARN-1321. Changed NMTokenCache to support both singleton and an instance usage. Contributed by Alejandro Abdelnur.

[cnauroth] HDFS-5065. Change attribution in CHANGES.txt to version 2.2.1.

[cnauroth] YARN-1358. TestYarnCLI fails on Windows due to line endings. Contributed by Chuan Liu.

[cnauroth] YARN-1357. TestContainerLaunch.testContainerEnvVariables fails on Windows. Contributed by Chuan Liu.

[cnauroth] HDFS-5386. Add feature documentation for datanode caching. Contributed by Colin Patrick McCabe.

[cnauroth] HDFS-5432. TestDatanodeJsp fails on Windows due to assumption that loopback address resolves to host name localhost. Contributed by Chris Nauroth.

[wang] HDFS-5433. When reloading fsimage during checkpointing, we should clear existing snapshottable directories. Contributed by Aaron T. Myers.

------------------------------------------
[...truncated 12604 lines...]
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid3910.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid3923.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid3936.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid3949.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid3962.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid3975.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid3988.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4001.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4014.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4027.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4040.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4053.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4066.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4079.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4092.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4105.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4118.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4131.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4144.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4157.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4170.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4183.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4196.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4209.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4222.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4235.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4248.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4261.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4274.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4287.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4300.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid4313.log>

Results :

Tests run: 0, Failures: 0, Errors: 0, Skipped: 0

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [48.088s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.953s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 50.892s
[INFO] Finished at: Thu Oct 31 11:35:29 UTC 2013
[INFO] Final Memory: 50M/662M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: ExecutionException; nested exception is java.util.concurrent.ExecutionException: java.lang.RuntimeException: The forked VM terminated without saying properly goodbye. VM crash or System.exit called ?
[ERROR] Command was/bin/sh -c cd <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs> && /home/jenkins/tools/java/jdk1.6.0_26/jre/bin/java -Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -jar <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter5627261474589442945.jar> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire2625886270624485890tmp> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_0268192065729311462tmp>
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating YARN-1343
Updating HDFS-5386
Updating YARN-1357
Updating YARN-1358
Updating HDFS-5432
Updating HDFS-5433
Updating HDFS-5065
Updating YARN-1321
Updating HDFS-4633

Hadoop-Hdfs-trunk - Build # 1569 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1569/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 12797 lines...]
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [48.088s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.953s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 50.892s
[INFO] Finished at: Thu Oct 31 11:35:29 UTC 2013
[INFO] Final Memory: 50M/662M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: ExecutionException; nested exception is java.util.concurrent.ExecutionException: java.lang.RuntimeException: The forked VM terminated without saying properly goodbye. VM crash or System.exit called ?
[ERROR] Command was/bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs && /home/jenkins/tools/java/jdk1.6.0_26/jre/bin/java -Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter5627261474589442945.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire2625886270624485890tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_0268192065729311462tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating YARN-1343
Updating HDFS-5386
Updating YARN-1357
Updating YARN-1358
Updating HDFS-5432
Updating HDFS-5433
Updating HDFS-5065
Updating YARN-1321
Updating HDFS-4633
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Build failed in Jenkins: Hadoop-Hdfs-trunk #1568

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1568/changes>

Changes:

[sandy] YARN-1306. Clean up hadoop-sls sample-conf according to YARN-1228 (Wei Yan via Sandy Ryza)

[arp] HDFS-5436. Move HsFtpFileSystem and HFtpFileSystem into org.apache.hdfs.web. (Contributed by Haohui Mai)

[bikas] Fix inadvertent file changes made via r1536888 (bikas)

[bikas] YARN-1068. Add admin support for HA operations (Karthik Kambatla via bikas)

[cmccabe] move HDFS-4657 to branch-2.2.1

[jlowe] MAPREDUCE-5598. TestUserDefinedCounters.testMapReduceJob is flakey. Contributed by Robert Kanter

[jlowe] MAPREDUCE-5596. Allow configuring the number of threads used to serve shuffle connections. Contributed by Sandy Ryza

------------------------------------------
[...truncated 12593 lines...]
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10670.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10683.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10696.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10709.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10722.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10735.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10749.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10762.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10775.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10788.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10801.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10814.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10827.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10840.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10853.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10866.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10879.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10892.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10905.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10918.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10931.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10944.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10957.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10970.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10983.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid10996.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid11009.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid11022.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid11035.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid11048.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid11061.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid11074.log>

Results :

Tests run: 0, Failures: 0, Errors: 0, Skipped: 0

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [49.706s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.962s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 52.524s
[INFO] Finished at: Wed Oct 30 11:35:36 UTC 2013
[INFO] Final Memory: 45M/724M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: ExecutionException; nested exception is java.util.concurrent.ExecutionException: java.lang.RuntimeException: The forked VM terminated without saying properly goodbye. VM crash or System.exit called ?
[ERROR] Command was/bin/sh -c cd <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs> && /home/jenkins/tools/java/jdk1.6.0_26/jre/bin/java -Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -jar <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter5581275443583162168.jar> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire6322230193075923784tmp> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_02739220727275155225tmp>
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating YARN-1306
Updating YARN-1228
Updating MAPREDUCE-5596
Updating HDFS-5436
Updating YARN-1068
Updating MAPREDUCE-5598
Updating HDFS-4657
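
The wall of "Cannot create GC thread. Out of system resources." crashes in this run means the JVM could not spawn new native threads, which usually points at the build slave's per-user process/thread limits or overall memory pressure rather than the -Xmx1024m heap visible in the surefire command line. A quick sketch for checking those limits on a Linux slave (the actual limits on the Jenkins hosts are not shown in this log):

    ulimit -u                            # max user processes; each Java thread counts against it on Linux
    ulimit -v                            # virtual memory cap, if one is set
    cat /proc/sys/kernel/threads-max     # system-wide thread ceiling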

Hadoop-Hdfs-trunk - Build # 1568 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1568/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 12786 lines...]
main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [49.706s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.962s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 52.524s
[INFO] Finished at: Wed Oct 30 11:35:36 UTC 2013
[INFO] Final Memory: 45M/724M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: ExecutionException; nested exception is java.util.concurrent.ExecutionException: java.lang.RuntimeException: The forked VM terminated without saying properly goodbye. VM crash or System.exit called ?
[ERROR] Command was/bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs && /home/jenkins/tools/java/jdk1.6.0_26/jre/bin/java -Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter5581275443583162168.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire6322230193075923784tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_02739220727275155225tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating YARN-1306
Updating YARN-1228
Updating MAPREDUCE-5596
Updating HDFS-5436
Updating YARN-1068
Updating MAPREDUCE-5598
Updating HDFS-4657
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.
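
Because the forked VM is dying outright ("terminated without saying properly goodbye") rather than individual tests failing, it can help to give every test class its own fresh JVM so the crash is pinned to a specific suite. A sketch of such a rerun, using standard Surefire 2.16 fork properties with illustrative values:

    cd hadoop-hdfs-project/hadoop-hdfs
    mvn test -DforkCount=1 -DreuseForks=false    # one fresh forked JVM per test class

With fork reuse disabled, the surviving surefire reports make it clear which class's JVM hit the resource limit.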

Build failed in Jenkins: Hadoop-Hdfs-trunk #1567

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1567/changes>

Changes:

[sandy] YARN-1109. Demote NodeManager "Sending out status for container" logs to debug (haosdent via Sandy Ryza)

[cmccabe] Merge HDFS-4949 branch back into trunk

[sandy] MAPREDUCE-4680. Job history cleaner should only check timestamps of files in old enough directories (Robert Kanter via Sandy Ryza)

[cnauroth] HDFS-5413. hdfs.cmd does not support passthrough to any arbitrary class. Contributed by Chris Nauroth.

[cnauroth] YARN-1349. yarn.cmd does not support passthrough to any arbitrary class. Contributed by Chris Nauroth.

------------------------------------------
[...truncated 12591 lines...]
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22438.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22449.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22460.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22471.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22482.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22493.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22504.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22515.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22526.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22537.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22548.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22559.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22570.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22581.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22592.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22603.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22614.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22625.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22636.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22647.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22658.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22669.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22680.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22691.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22702.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22713.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22724.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22735.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22746.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22757.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22768.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22779.log>
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Cannot create GC thread. Out of system resources.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/hs_err_pid22790.log>

Results :

Tests run: 0, Failures: 0, Errors: 0, Skipped: 0

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [49.486s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.953s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 52.300s
[INFO] Finished at: Tue Oct 29 11:34:36 UTC 2013
[INFO] Final Memory: 49M/658M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: ExecutionException; nested exception is java.util.concurrent.ExecutionException: java.lang.RuntimeException: The forked VM terminated without saying properly goodbye. VM crash or System.exit called ?
[ERROR] Command was/bin/sh -c cd <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs> && /home/jenkins/tools/java/jdk1.6.0_26/jre/bin/java -Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -jar <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter5090675212736170239.jar> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire7161554381034715858tmp> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_06648294810194451524tmp>
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating HDFS-4949
Updating YARN-1109
Updating YARN-1349
Updating HDFS-5413
Updating MAPREDUCE-4680
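
Each of the crashes above also leaves an hs_err_pid*.log report in the hadoop-hdfs module directory, and its opening banner repeats the same "insufficient memory" message along with the state of the thread that failed. A sketch for skimming those reports (file names taken from the console output above):

    cd hadoop-hdfs-project/hadoop-hdfs
    ls hs_err_pid*.log
    head -n 40 hs_err_pid22438.log    # banner plus the crashing thread's details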

Build failed in Jenkins: Hadoop-Hdfs-trunk #1566

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1566/changes>

Changes:

[bikas] YARN-1022. Unnecessary INFO logs in AMRMClientAsync (haosdent via bikas)

------------------------------------------
[...truncated 12964 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.51 sec - in org.apache.hadoop.hdfs.TestDatanodeReport
Running org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Tests run: 10, Failures: 0, Errors: 0, Skipped: 10, Time elapsed: 0.194 sec - in org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Running org.apache.hadoop.hdfs.TestFileInputStreamCache
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.205 sec - in org.apache.hadoop.hdfs.TestFileInputStreamCache
Running org.apache.hadoop.hdfs.TestRestartDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.899 sec - in org.apache.hadoop.hdfs.TestRestartDFS
Running org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.057 sec - in org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Running org.apache.hadoop.hdfs.TestDFSRemove
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.911 sec - in org.apache.hadoop.hdfs.TestDFSRemove
Running org.apache.hadoop.hdfs.TestHDFSTrash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.994 sec - in org.apache.hadoop.hdfs.TestHDFSTrash
Running org.apache.hadoop.hdfs.TestClientReportBadBlock
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 42.797 sec - in org.apache.hadoop.hdfs.TestClientReportBadBlock
Running org.apache.hadoop.hdfs.TestQuota
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.166 sec - in org.apache.hadoop.hdfs.TestQuota
Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.987 sec - in org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Running org.apache.hadoop.hdfs.TestDatanodeRegistration
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.033 sec - in org.apache.hadoop.hdfs.TestDatanodeRegistration
Running org.apache.hadoop.hdfs.TestAbandonBlock
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.05 sec - in org.apache.hadoop.hdfs.TestAbandonBlock
Running org.apache.hadoop.hdfs.TestDFSShell
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.115 sec - in org.apache.hadoop.hdfs.TestDFSShell
Running org.apache.hadoop.hdfs.TestListFilesInDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.49 sec - in org.apache.hadoop.hdfs.TestListFilesInDFS
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.162 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Running org.apache.hadoop.hdfs.TestPeerCache
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.316 sec - in org.apache.hadoop.hdfs.TestPeerCache
Running org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 8.935 sec - in org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Running org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.136 sec - in org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Running org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 173.278 sec - in org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Running org.apache.hadoop.hdfs.TestMultiThreadedHflush
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.034 sec - in org.apache.hadoop.hdfs.TestMultiThreadedHflush
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.741 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Running org.apache.hadoop.hdfs.TestMiniDFSCluster
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.308 sec - in org.apache.hadoop.hdfs.TestMiniDFSCluster
Running org.apache.hadoop.hdfs.TestLease
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.706 sec - in org.apache.hadoop.hdfs.TestLease
Running org.apache.hadoop.hdfs.TestListFilesInFileContext
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.369 sec - in org.apache.hadoop.hdfs.TestListFilesInFileContext
Running org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.281 sec - in org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Running org.apache.hadoop.hdfs.TestDFSClientFailover
Tests run: 6, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 10.013 sec - in org.apache.hadoop.hdfs.TestDFSClientFailover
Running org.apache.hadoop.hdfs.TestFileAppend2
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.696 sec - in org.apache.hadoop.hdfs.TestFileAppend2
Running org.apache.hadoop.hdfs.TestLocalDFS
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.89 sec - in org.apache.hadoop.hdfs.TestLocalDFS
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.785 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.hdfs.TestSeekBug
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.708 sec - in org.apache.hadoop.hdfs.TestSeekBug
Running org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.912 sec - in org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Running org.apache.hadoop.hdfs.util.TestChunkedArrayList
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.36 sec - in org.apache.hadoop.hdfs.util.TestChunkedArrayList
Running org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.197 sec - in org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Running org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.193 sec - in org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Running org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Running org.apache.hadoop.hdfs.util.TestDiff
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.289 sec - in org.apache.hadoop.hdfs.util.TestDiff
Running org.apache.hadoop.hdfs.util.TestMD5FileUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.233 sec - in org.apache.hadoop.hdfs.util.TestMD5FileUtils
Running org.apache.hadoop.hdfs.util.TestDirectBufferPool
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.092 sec - in org.apache.hadoop.hdfs.util.TestDirectBufferPool
Running org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.158 sec - in org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Running org.apache.hadoop.hdfs.util.TestXMLUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec - in org.apache.hadoop.hdfs.util.TestXMLUtils
Running org.apache.hadoop.hdfs.util.TestCyclicIteration
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestCyclicIteration
Running org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.165 sec - in org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Running org.apache.hadoop.hdfs.TestSetTimes
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.509 sec - in org.apache.hadoop.hdfs.TestSetTimes
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.164 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Running org.apache.hadoop.hdfs.TestBlockReaderLocal
Tests run: 11, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 12.049 sec - in org.apache.hadoop.hdfs.TestBlockReaderLocal
Running org.apache.hadoop.hdfs.TestHftpURLTimeouts
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.973 sec - in org.apache.hadoop.hdfs.TestHftpURLTimeouts
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.705 sec - in org.apache.hadoop.cli.TestHDFSCLI
Running org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.133 sec - in org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Running org.apache.hadoop.fs.TestGlobPaths
Tests run: 31, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 4.337 sec - in org.apache.hadoop.fs.TestGlobPaths
Running org.apache.hadoop.fs.TestResolveHdfsSymlink
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.343 sec - in org.apache.hadoop.fs.TestResolveHdfsSymlink
Running org.apache.hadoop.fs.TestFcHdfsSetUMask
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.462 sec - in org.apache.hadoop.fs.TestFcHdfsSetUMask
Running org.apache.hadoop.fs.TestFcHdfsPermission
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.44 sec - in org.apache.hadoop.fs.TestFcHdfsPermission
Running org.apache.hadoop.fs.TestUrlStreamHandler
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.472 sec - in org.apache.hadoop.fs.TestUrlStreamHandler
Running org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.014 sec - in org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.561 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.399 sec - in org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.347 sec - in org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.939 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.017 sec - in org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Running org.apache.hadoop.fs.permission.TestStickyBit
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.521 sec - in org.apache.hadoop.fs.permission.TestStickyBit
Running org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.294 sec - in org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Running org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Tests run: 69, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.849 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Running org.apache.hadoop.fs.shell.TestHdfsTextCommand
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.432 sec - in org.apache.hadoop.fs.shell.TestHdfsTextCommand
Running org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.406 sec - in org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Running org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Tests run: 67, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.995 sec - in org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Running org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Tests run: 7, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 0.208 sec - in org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Running org.apache.hadoop.fs.TestSymlinkHdfsDisable
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.008 sec - in org.apache.hadoop.fs.TestSymlinkHdfsDisable
Running org.apache.hadoop.fs.TestVolumeId
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.066 sec - in org.apache.hadoop.fs.TestVolumeId
Running org.apache.hadoop.fs.TestSymlinkHdfsFileSystem
Tests run: 72, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 8.044 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileSystem

Results :

Tests in error: 
  TestBlocksWithNotEnoughRacks.testCorruptBlockRereplicatedAcrossRacks:219 » Timeout
  TestWebHdfsFileSystemContract.testResponseCode:465 » IO All datanodes 127.0.0....
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteEmptyFile:236->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteHalfABlock:240->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock:244->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteOneAndAHalfBlocks:248->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteTwoBlocks:252->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testOverwrite:271->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteInNonExistentDirectory:295->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testDeleteRecursively:313->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileMoveToNonExistentDirectory:356->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileMoveToExistingDirectory:365->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileAsExistingFile:375->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileAsExistingDirectory:385->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameDirectoryMoveToExistingDirectory:407->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameDirectoryAsExistingFile:430->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameDirectoryAsExistingDirectory:439->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testInputStreamClosedTwice:461->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testOutputStreamClosedTwice:473 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testOverWriteAndRead:509->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testFilesystemIsCaseSensitive:535 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testZeroByteFilesAreFiles:562 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testMultiByteFilesAreFiles:575 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameChildDirForbidden:614->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameDirToSelf:647->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileToSelf:679->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testMoveFileUnderParent:693->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testLSRootDir:703->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testListStatusRootDir:710->FileSystemContractBaseTest.createFile:484 » IO

Tests run: 2161, Failures: 0, Errors: 29, Skipped: 50

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [2:00:27.944s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [3.340s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:00:32.308s
[INFO] Finished at: Mon Oct 28 13:35:20 UTC 2013
[INFO] Final Memory: 42M/426M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating YARN-1022
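
This run (#1566), unlike the two above it in this digest, got through its full two-hour test pass: of the 29 errors, 28 are in TestWebHdfsFileSystemContract (the excerpt at the end of this digest shows one of them failing with "unable to create new native thread", the same resource exhaustion pattern), and the remaining one is a timeout in TestBlocksWithNotEnoughRacks. Re-running those suites in isolation is usually the quickest way to tell slave exhaustion from a genuine regression; a sketch using Surefire's standard -Dtest selector:

    cd hadoop-hdfs-project/hadoop-hdfs
    mvn test -Dtest=TestBlocksWithNotEnoughRacks
    mvn test -Dtest=TestWebHdfsFileSystemContract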

Hadoop-Hdfs-trunk - Build # 1566 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1566/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 13157 lines...]
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [2:00:27.944s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [3.340s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:00:32.308s
[INFO] Finished at: Mon Oct 28 13:35:20 UTC 2013
[INFO] Final Memory: 42M/426M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating YARN-1022
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
29 tests failed.
REGRESSION:  org.apache.hadoop.hdfs.server.blockmanagement.TestBlocksWithNotEnoughRacks.testCorruptBlockRereplicatedAcrossRacks

Error Message:
Timed out waiting for corrupt replicas. Waiting for 1, but only found 0

Stack Trace:
java.util.concurrent.TimeoutException: Timed out waiting for corrupt replicas. Waiting for 1, but only found 0
	at org.apache.hadoop.hdfs.DFSTestUtil.waitCorruptReplicas(DFSTestUtil.java:351)
	at org.apache.hadoop.hdfs.server.blockmanagement.TestBlocksWithNotEnoughRacks.testCorruptBlockRereplicatedAcrossRacks(TestBlocksWithNotEnoughRacks.java:219)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteEmptyFile

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteEmptyFile(FileSystemContractBaseTest.java:236)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameFileMoveToExistingDirectory

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileMoveToExistingDirectory(FileSystemContractBaseTest.java:365)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileMoveToExistingDirectory(FileSystemContractBaseTest.java:365)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameFileAsExistingFile

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileAsExistingFile(FileSystemContractBaseTest.java:375)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileAsExistingFile(FileSystemContractBaseTest.java:375)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameFileAsExistingDirectory

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileAsExistingDirectory(FileSystemContractBaseTest.java:385)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileAsExistingDirectory(FileSystemContractBaseTest.java:385)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameDirectoryMoveToExistingDirectory

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirectoryMoveToExistingDirectory(FileSystemContractBaseTest.java:407)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirectoryMoveToExistingDirectory(FileSystemContractBaseTest.java:407)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameDirectoryAsExistingFile

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirectoryAsExistingFile(FileSystemContractBaseTest.java:430)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirectoryAsExistingFile(FileSystemContractBaseTest.java:430)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameDirectoryAsExistingDirectory

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirectoryAsExistingDirectory(FileSystemContractBaseTest.java:439)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirectoryAsExistingDirectory(FileSystemContractBaseTest.java:439)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testInputStreamClosedTwice

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testInputStreamClosedTwice(FileSystemContractBaseTest.java:461)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testInputStreamClosedTwice(FileSystemContractBaseTest.java:461)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testOutputStreamClosedTwice

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOutputStreamClosedTwice(FileSystemContractBaseTest.java:473)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOutputStreamClosedTwice(FileSystemContractBaseTest.java:473)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testOverWriteAndRead

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOverWriteAndRead(FileSystemContractBaseTest.java:509)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOverWriteAndRead(FileSystemContractBaseTest.java:509)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testFilesystemIsCaseSensitive

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testFilesystemIsCaseSensitive(FileSystemContractBaseTest.java:535)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testFilesystemIsCaseSensitive(FileSystemContractBaseTest.java:535)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testZeroByteFilesAreFiles

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testZeroByteFilesAreFiles(FileSystemContractBaseTest.java:562)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testZeroByteFilesAreFiles(FileSystemContractBaseTest.java:562)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testMultiByteFilesAreFiles

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testMultiByteFilesAreFiles(FileSystemContractBaseTest.java:575)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testMultiByteFilesAreFiles(FileSystemContractBaseTest.java:575)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameChildDirForbidden

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameChildDirForbidden(FileSystemContractBaseTest.java:614)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameChildDirForbidden(FileSystemContractBaseTest.java:614)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameDirToSelf

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirToSelf(FileSystemContractBaseTest.java:647)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirToSelf(FileSystemContractBaseTest.java:647)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameFileToSelf

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileToSelf(FileSystemContractBaseTest.java:679)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testMoveFileUnderParent

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testMoveFileUnderParent(FileSystemContractBaseTest.java:693)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testMoveFileUnderParent(FileSystemContractBaseTest.java:693)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testLSRootDir

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testLSRootDir(FileSystemContractBaseTest.java:703)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testLSRootDir(FileSystemContractBaseTest.java:703)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testListStatusRootDir

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testListStatusRootDir(FileSystemContractBaseTest.java:710)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testListStatusRootDir(FileSystemContractBaseTest.java:710)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteHalfABlock

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteHalfABlock(FileSystemContractBaseTest.java:240)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteHalfABlock(FileSystemContractBaseTest.java:240)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testResponseCode

Error Message:
All datanodes 127.0.0.1:52376 are bad. Aborting...

Stack Trace:
java.io.IOException: All datanodes 127.0.0.1:52376 are bad. Aborting...
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.hdfs.AppendTestUtil.testAppend(AppendTestUtil.java:198)
	at org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testResponseCode(TestWebHdfsFileSystemContract.java:465)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteOneBlock

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock(FileSystemContractBaseTest.java:244)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock(FileSystemContractBaseTest.java:244)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteOneAndAHalfBlocks

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneAndAHalfBlocks(FileSystemContractBaseTest.java:248)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneAndAHalfBlocks(FileSystemContractBaseTest.java:248)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteTwoBlocks

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteTwoBlocks(FileSystemContractBaseTest.java:252)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteTwoBlocks(FileSystemContractBaseTest.java:252)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testOverwrite

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOverwrite(FileSystemContractBaseTest.java:271)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOverwrite(FileSystemContractBaseTest.java:271)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteInNonExistentDirectory

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteInNonExistentDirectory(FileSystemContractBaseTest.java:295)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteInNonExistentDirectory(FileSystemContractBaseTest.java:295)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testDeleteRecursively

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testDeleteRecursively(FileSystemContractBaseTest.java:313)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testDeleteRecursively(FileSystemContractBaseTest.java:313)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameFileMoveToNonExistentDirectory

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileMoveToNonExistentDirectory(FileSystemContractBaseTest.java:356)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":35964; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileMoveToNonExistentDirectory(FileSystemContractBaseTest.java:356)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)

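All of the WebHDFS contract failures above share one signature: the test writes through WebHdfsFileSystem, and the wrapped "Couldn't set up IO streams" IOException only surfaces when the output stream's close() runs validateResponse() against the HTTP response (WebHdfsFileSystem$1.close -> validateResponse in every trace). As a minimal sketch of that create-then-close path — not part of this build log; the webhdfs URI, port, and file path are illustrative assumptions, not values taken from this run:

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class WebHdfsCreateSketch {
      public static void main(String[] args) throws Exception {
        // Assumption: a NameNode with WebHDFS enabled is reachable at this address.
        URI uri = URI.create("webhdfs://localhost:50070/");
        try (FileSystem fs = FileSystem.get(uri, new Configuration())) {
          Path p = new Path("/tmp/webhdfs-sketch.txt");
          FSDataOutputStream out = fs.create(p, true /* overwrite */);
          out.write(new byte[4096]);
          // The HTTP response is only validated on close(); in the traces above,
          // the IOException is thrown from exactly this point in the client.
          out.close();
          System.out.println("created " + p + ", length=" + fs.getFileStatus(p).getLen());
        }
      }
    }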


Build failed in Jenkins: Hadoop-Hdfs-trunk #1565

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1565/>

------------------------------------------
[...truncated 12655 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.354 sec - in org.apache.hadoop.hdfs.TestFileCreationEmpty
Running org.apache.hadoop.hdfs.TestSetrepIncreasing
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.923 sec - in org.apache.hadoop.hdfs.TestSetrepIncreasing
Running org.apache.hadoop.hdfs.TestEncryptedTransfer
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 85.306 sec - in org.apache.hadoop.hdfs.TestEncryptedTransfer
Running org.apache.hadoop.hdfs.TestDFSUpgrade
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.371 sec - in org.apache.hadoop.hdfs.TestDFSUpgrade
Running org.apache.hadoop.hdfs.TestCrcCorruption
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.541 sec - in org.apache.hadoop.hdfs.TestCrcCorruption
Running org.apache.hadoop.hdfs.TestHFlush
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.094 sec - in org.apache.hadoop.hdfs.TestHFlush
Running org.apache.hadoop.hdfs.TestFileAppendRestart
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.952 sec - in org.apache.hadoop.hdfs.TestFileAppendRestart
Running org.apache.hadoop.hdfs.TestDatanodeReport
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.466 sec - in org.apache.hadoop.hdfs.TestDatanodeReport
Running org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Tests run: 10, Failures: 0, Errors: 0, Skipped: 10, Time elapsed: 0.194 sec - in org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Running org.apache.hadoop.hdfs.TestFileInputStreamCache
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.205 sec - in org.apache.hadoop.hdfs.TestFileInputStreamCache
Running org.apache.hadoop.hdfs.TestRestartDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.819 sec - in org.apache.hadoop.hdfs.TestRestartDFS
Running org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.153 sec - in org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Running org.apache.hadoop.hdfs.TestDFSRemove
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.068 sec - in org.apache.hadoop.hdfs.TestDFSRemove
Running org.apache.hadoop.hdfs.TestHDFSTrash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.719 sec - in org.apache.hadoop.hdfs.TestHDFSTrash
Running org.apache.hadoop.hdfs.TestClientReportBadBlock
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 67.206 sec - in org.apache.hadoop.hdfs.TestClientReportBadBlock
Running org.apache.hadoop.hdfs.TestQuota
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.314 sec - in org.apache.hadoop.hdfs.TestQuota
Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.375 sec - in org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Running org.apache.hadoop.hdfs.TestDatanodeRegistration
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.876 sec - in org.apache.hadoop.hdfs.TestDatanodeRegistration
Running org.apache.hadoop.hdfs.TestAbandonBlock
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.627 sec - in org.apache.hadoop.hdfs.TestAbandonBlock
Running org.apache.hadoop.hdfs.TestDFSShell
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 33.075 sec - in org.apache.hadoop.hdfs.TestDFSShell
Running org.apache.hadoop.hdfs.TestListFilesInDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.199 sec - in org.apache.hadoop.hdfs.TestListFilesInDFS
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Running org.apache.hadoop.hdfs.TestPeerCache
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.32 sec - in org.apache.hadoop.hdfs.TestPeerCache
Running org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 8.442 sec - in org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Running org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.519 sec - in org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Running org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 103.714 sec - in org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Running org.apache.hadoop.hdfs.TestMultiThreadedHflush
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.637 sec - in org.apache.hadoop.hdfs.TestMultiThreadedHflush
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.052 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Running org.apache.hadoop.hdfs.TestMiniDFSCluster
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.413 sec - in org.apache.hadoop.hdfs.TestMiniDFSCluster
Running org.apache.hadoop.hdfs.TestLease
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.738 sec - in org.apache.hadoop.hdfs.TestLease
Running org.apache.hadoop.hdfs.TestListFilesInFileContext
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.396 sec - in org.apache.hadoop.hdfs.TestListFilesInFileContext
Running org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.315 sec - in org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Running org.apache.hadoop.hdfs.TestDFSClientFailover
Tests run: 6, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 11.02 sec - in org.apache.hadoop.hdfs.TestDFSClientFailover
Running org.apache.hadoop.hdfs.TestFileAppend2
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.245 sec - in org.apache.hadoop.hdfs.TestFileAppend2
Running org.apache.hadoop.hdfs.TestLocalDFS
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.772 sec - in org.apache.hadoop.hdfs.TestLocalDFS
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.662 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.hdfs.TestSeekBug
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.385 sec - in org.apache.hadoop.hdfs.TestSeekBug
Running org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.83 sec - in org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Running org.apache.hadoop.hdfs.util.TestChunkedArrayList
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.366 sec - in org.apache.hadoop.hdfs.util.TestChunkedArrayList
Running org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.197 sec - in org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Running org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.175 sec - in org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Running org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Running org.apache.hadoop.hdfs.util.TestDiff
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.295 sec - in org.apache.hadoop.hdfs.util.TestDiff
Running org.apache.hadoop.hdfs.util.TestMD5FileUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.229 sec - in org.apache.hadoop.hdfs.util.TestMD5FileUtils
Running org.apache.hadoop.hdfs.util.TestDirectBufferPool
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.093 sec - in org.apache.hadoop.hdfs.util.TestDirectBufferPool
Running org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.159 sec - in org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Running org.apache.hadoop.hdfs.util.TestXMLUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec - in org.apache.hadoop.hdfs.util.TestXMLUtils
Running org.apache.hadoop.hdfs.util.TestCyclicIteration
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec - in org.apache.hadoop.hdfs.util.TestCyclicIteration
Running org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.165 sec - in org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Running org.apache.hadoop.hdfs.TestSetTimes
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.571 sec - in org.apache.hadoop.hdfs.TestSetTimes
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.162 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Running org.apache.hadoop.hdfs.TestBlockReaderLocal
Tests run: 11, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 12.911 sec - in org.apache.hadoop.hdfs.TestBlockReaderLocal
Running org.apache.hadoop.hdfs.TestHftpURLTimeouts
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.965 sec - in org.apache.hadoop.hdfs.TestHftpURLTimeouts
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.559 sec - in org.apache.hadoop.cli.TestHDFSCLI
Running org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.132 sec - in org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Running org.apache.hadoop.fs.TestGlobPaths
Tests run: 31, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 4.198 sec - in org.apache.hadoop.fs.TestGlobPaths
Running org.apache.hadoop.fs.TestResolveHdfsSymlink
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.434 sec - in org.apache.hadoop.fs.TestResolveHdfsSymlink
Running org.apache.hadoop.fs.TestFcHdfsSetUMask
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.501 sec - in org.apache.hadoop.fs.TestFcHdfsSetUMask
Running org.apache.hadoop.fs.TestFcHdfsPermission
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.672 sec - in org.apache.hadoop.fs.TestFcHdfsPermission
Running org.apache.hadoop.fs.TestUrlStreamHandler
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.464 sec - in org.apache.hadoop.fs.TestUrlStreamHandler
Running org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.991 sec - in org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.602 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.013 sec - in org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.405 sec - in org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.049 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.018 sec - in org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Running org.apache.hadoop.fs.permission.TestStickyBit
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.787 sec - in org.apache.hadoop.fs.permission.TestStickyBit
Running org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.399 sec - in org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Running org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Tests run: 69, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.021 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Running org.apache.hadoop.fs.shell.TestHdfsTextCommand
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.637 sec - in org.apache.hadoop.fs.shell.TestHdfsTextCommand
Running org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.38 sec - in org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Running org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Tests run: 67, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.668 sec - in org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Running org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Tests run: 7, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 0.208 sec - in org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Running org.apache.hadoop.fs.TestSymlinkHdfsDisable
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.917 sec - in org.apache.hadoop.fs.TestSymlinkHdfsDisable
Running org.apache.hadoop.fs.TestVolumeId
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.066 sec - in org.apache.hadoop.fs.TestVolumeId
Running org.apache.hadoop.fs.TestSymlinkHdfsFileSystem
Tests run: 72, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 8.409 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileSystem

Results :

Failed tests: 
  TestBalancerWithNodeGroup.testBalancerEndInNoMoveProgress:347->runBalancerCanFinish:188 null

Tests in error: 
  TestBalancerWithNodeGroup.testBalancerWithNodeGroup:302->runBalancer:173 »  te...
  TestWebHdfsFileSystemContract.testResponseCode:465 » IO All datanodes 127.0.0....
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteHalfABlock:240->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock:244->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteOneAndAHalfBlocks:248->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteTwoBlocks:252->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testOverwrite:271->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteInNonExistentDirectory:295->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testDeleteRecursively:313->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileMoveToNonExistentDirectory:356->FileSystemContractBaseTest.createFile:484 » SocketTimeout

Tests run: 2161, Failures: 1, Errors: 10, Skipped: 50
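Nine of the ten errors in this summary are in TestWebHdfsFileSystemContract. To iterate on just that suite rather than the full 2161-test Surefire run, one option (a sketch only; assumes JUnit and the hadoop-hdfs test classes are already on the classpath) is a plain JUnitCore driver:

    import org.junit.runner.JUnitCore;
    import org.junit.runner.Result;
    import org.junit.runner.notification.Failure;

    public class RunWebHdfsContractSuite {
      public static void main(String[] args) throws Exception {
        // Assumption: the hadoop-hdfs test jar and its dependencies are on the classpath.
        Result r = JUnitCore.runClasses(
            Class.forName("org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract"));
        System.out.println(r.getRunCount() + " run, " + r.getFailureCount() + " failed");
        for (Failure f : r.getFailures()) {
          // Print the failing test header and message, mirroring the report above.
          System.out.println(f.getTestHeader() + ": " + f.getMessage());
        }
      }
    }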

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:46:02.708s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [2.325s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:46:05.882s
[INFO] Finished at: Sun Oct 27 13:20:42 UTC 2013
[INFO] Final Memory: 38M/352M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results

Build failed in Jenkins: Hadoop-Hdfs-trunk #1564

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1564/changes>

Changes:

[cnauroth] HADOOP-10072. TestNfsExports#testMultiMatchers fails due to non-deterministic timing around cache expiry check. Contributed by Chris Nauroth.

[sandy] YARN-1333: Add missing file SchedulerAppUtils

[sandy] YARN-1333. Support blacklisting in the Fair Scheduler (Tsuyoshi Ozawa via Sandy Ryza)

[jing9] HDFS-5257. addBlock() retry should return LocatedBlock with locations else client will get AIOBE. Contributed by Vinay.

------------------------------------------
[...truncated 11836 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.261 sec - in org.apache.hadoop.hdfs.TestFileCreationEmpty
Running org.apache.hadoop.hdfs.TestSetrepIncreasing
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.903 sec - in org.apache.hadoop.hdfs.TestSetrepIncreasing
Running org.apache.hadoop.hdfs.TestEncryptedTransfer
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 79.043 sec - in org.apache.hadoop.hdfs.TestEncryptedTransfer
Running org.apache.hadoop.hdfs.TestDFSUpgrade
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.324 sec - in org.apache.hadoop.hdfs.TestDFSUpgrade
Running org.apache.hadoop.hdfs.TestCrcCorruption
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 40.71 sec - in org.apache.hadoop.hdfs.TestCrcCorruption
Running org.apache.hadoop.hdfs.TestHFlush
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.406 sec - in org.apache.hadoop.hdfs.TestHFlush
Running org.apache.hadoop.hdfs.TestFileAppendRestart
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.836 sec - in org.apache.hadoop.hdfs.TestFileAppendRestart
Running org.apache.hadoop.hdfs.TestDatanodeReport
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.515 sec - in org.apache.hadoop.hdfs.TestDatanodeReport
Running org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Tests run: 10, Failures: 0, Errors: 0, Skipped: 10, Time elapsed: 0.203 sec - in org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Running org.apache.hadoop.hdfs.TestFileInputStreamCache
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.204 sec - in org.apache.hadoop.hdfs.TestFileInputStreamCache
Running org.apache.hadoop.hdfs.TestRestartDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.899 sec - in org.apache.hadoop.hdfs.TestRestartDFS
Running org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.868 sec - in org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Running org.apache.hadoop.hdfs.TestDFSRemove
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.405 sec - in org.apache.hadoop.hdfs.TestDFSRemove
Running org.apache.hadoop.hdfs.TestHDFSTrash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.017 sec - in org.apache.hadoop.hdfs.TestHDFSTrash
Running org.apache.hadoop.hdfs.TestClientReportBadBlock
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 40.311 sec - in org.apache.hadoop.hdfs.TestClientReportBadBlock
Running org.apache.hadoop.hdfs.TestQuota
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.16 sec - in org.apache.hadoop.hdfs.TestQuota
Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.594 sec - in org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Running org.apache.hadoop.hdfs.TestDatanodeRegistration
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.982 sec - in org.apache.hadoop.hdfs.TestDatanodeRegistration
Running org.apache.hadoop.hdfs.TestAbandonBlock
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.303 sec - in org.apache.hadoop.hdfs.TestAbandonBlock
Running org.apache.hadoop.hdfs.TestDFSShell
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 37.375 sec - in org.apache.hadoop.hdfs.TestDFSShell
Running org.apache.hadoop.hdfs.TestListFilesInDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.375 sec - in org.apache.hadoop.hdfs.TestListFilesInDFS
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Running org.apache.hadoop.hdfs.TestPeerCache
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.317 sec - in org.apache.hadoop.hdfs.TestPeerCache
Running org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 8.499 sec - in org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Running org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.194 sec - in org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Running org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 124.298 sec - in org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Running org.apache.hadoop.hdfs.TestMultiThreadedHflush
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.166 sec - in org.apache.hadoop.hdfs.TestMultiThreadedHflush
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.74 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Running org.apache.hadoop.hdfs.TestMiniDFSCluster
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.397 sec - in org.apache.hadoop.hdfs.TestMiniDFSCluster
Running org.apache.hadoop.hdfs.TestLease
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.885 sec - in org.apache.hadoop.hdfs.TestLease
Running org.apache.hadoop.hdfs.TestListFilesInFileContext
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.267 sec - in org.apache.hadoop.hdfs.TestListFilesInFileContext
Running org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.274 sec - in org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Running org.apache.hadoop.hdfs.TestDFSClientFailover
Tests run: 6, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 9.922 sec - in org.apache.hadoop.hdfs.TestDFSClientFailover
Running org.apache.hadoop.hdfs.TestFileAppend2
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.179 sec - in org.apache.hadoop.hdfs.TestFileAppend2
Running org.apache.hadoop.hdfs.TestLocalDFS
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.026 sec - in org.apache.hadoop.hdfs.TestLocalDFS
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.872 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.hdfs.TestSeekBug
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.577 sec - in org.apache.hadoop.hdfs.TestSeekBug
Running org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.827 sec - in org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Running org.apache.hadoop.hdfs.util.TestChunkedArrayList
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.363 sec - in org.apache.hadoop.hdfs.util.TestChunkedArrayList
Running org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.213 sec - in org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Running org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.196 sec - in org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Running org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Running org.apache.hadoop.hdfs.util.TestDiff
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.177 sec - in org.apache.hadoop.hdfs.util.TestDiff
Running org.apache.hadoop.hdfs.util.TestMD5FileUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.233 sec - in org.apache.hadoop.hdfs.util.TestMD5FileUtils
Running org.apache.hadoop.hdfs.util.TestDirectBufferPool
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.091 sec - in org.apache.hadoop.hdfs.util.TestDirectBufferPool
Running org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.18 sec - in org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Running org.apache.hadoop.hdfs.util.TestXMLUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestXMLUtils
Running org.apache.hadoop.hdfs.util.TestCyclicIteration
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec - in org.apache.hadoop.hdfs.util.TestCyclicIteration
Running org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.165 sec - in org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Running org.apache.hadoop.hdfs.TestSetTimes
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.332 sec - in org.apache.hadoop.hdfs.TestSetTimes
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.164 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Running org.apache.hadoop.hdfs.TestBlockReaderLocal
Tests run: 11, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 11.665 sec - in org.apache.hadoop.hdfs.TestBlockReaderLocal
Running org.apache.hadoop.hdfs.TestHftpURLTimeouts
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.953 sec - in org.apache.hadoop.hdfs.TestHftpURLTimeouts
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 48.026 sec - in org.apache.hadoop.cli.TestHDFSCLI
Running org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.132 sec - in org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Running org.apache.hadoop.fs.TestGlobPaths
Tests run: 31, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 4.293 sec - in org.apache.hadoop.fs.TestGlobPaths
Running org.apache.hadoop.fs.TestResolveHdfsSymlink
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.375 sec - in org.apache.hadoop.fs.TestResolveHdfsSymlink
Running org.apache.hadoop.fs.TestFcHdfsSetUMask
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.547 sec - in org.apache.hadoop.fs.TestFcHdfsSetUMask
Running org.apache.hadoop.fs.TestFcHdfsPermission
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.445 sec - in org.apache.hadoop.fs.TestFcHdfsPermission
Running org.apache.hadoop.fs.TestUrlStreamHandler
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.508 sec - in org.apache.hadoop.fs.TestUrlStreamHandler
Running org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.96 sec - in org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.573 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.988 sec - in org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.499 sec - in org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.395 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.029 sec - in org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Running org.apache.hadoop.fs.permission.TestStickyBit
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.464 sec - in org.apache.hadoop.fs.permission.TestStickyBit
Running org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.138 sec - in org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Running org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Tests run: 69, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.625 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Running org.apache.hadoop.fs.shell.TestHdfsTextCommand
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.589 sec - in org.apache.hadoop.fs.shell.TestHdfsTextCommand
Running org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.38 sec - in org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Running org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Tests run: 67, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.615 sec - in org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Running org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Tests run: 7, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 0.211 sec - in org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Running org.apache.hadoop.fs.TestSymlinkHdfsDisable
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.045 sec - in org.apache.hadoop.fs.TestSymlinkHdfsDisable
Running org.apache.hadoop.fs.TestVolumeId
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.067 sec - in org.apache.hadoop.fs.TestVolumeId
Running org.apache.hadoop.fs.TestSymlinkHdfsFileSystem
Tests run: 72, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 7.701 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileSystem

Results :

Tests in error: 
  TestWebHdfsFileSystemContract.testResponseCode:465 » IO All datanodes 127.0.0....
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteEmptyFile:236->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteHalfABlock:240->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock:244->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteOneAndAHalfBlocks:248->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteTwoBlocks:252->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testOverwrite:271->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteInNonExistentDirectory:295->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testDeleteRecursively:313->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileMoveToNonExistentDirectory:356->FileSystemContractBaseTest.createFile:484 » SocketTimeout

Tests run: 2161, Failures: 0, Errors: 10, Skipped: 50
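
The errors above all come from TestWebHdfsFileSystemContract, which runs the generic FileSystemContractBaseTest cases over a webhdfs:// URI, so most of the failures funnel through the same createFile/writeAndRead helpers. For reference, a minimal sketch of the create/write/read round trip those helpers perform, assuming a hypothetical WebHDFS endpoint on localhost:50070 (host, port and path are placeholders, not values taken from this build):

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class WebHdfsRoundTrip {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // the webhdfs:// scheme is served by org.apache.hadoop.hdfs.web.WebHdfsFileSystem
        FileSystem fs = FileSystem.get(URI.create("webhdfs://localhost:50070/"), conf);
        Path file = new Path("/tmp/webhdfs-smoke-test");
        byte[] data = "hello webhdfs".getBytes("UTF-8");

        FSDataOutputStream out = fs.create(file, true);  // overwrite if the path exists
        out.write(data);
        out.close();  // closing the stream finalizes the CREATE on the server side

        byte[] back = new byte[data.length];
        FSDataInputStream in = fs.open(file);
        in.readFully(back);
        in.close();

        System.out.println("read back: " + new String(back, "UTF-8"));
        fs.delete(file, false);
      }
    }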

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:45:09.505s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [5.133s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:45:15.483s
[INFO] Finished at: Sat Oct 26 13:18:57 UTC 2013
[INFO] Final Memory: 36M/378M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
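
Each test class run by Surefire leaves a TEST-<classname>.xml report in the surefire-reports directory named above. A small, hypothetical helper (the default path and the report layout assumptions are the usual Surefire ones, not anything specific to this build) that scans those reports and prints every test case that ended in an <error> or <failure>:

    import java.io.File;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;

    public class ListSurefireFailures {
      public static void main(String[] args) throws Exception {
        File reportDir = new File(args.length > 0 ? args[0]
            : "hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports");
        File[] reports = reportDir.listFiles();
        if (reports == null) {
          System.err.println("no such directory: " + reportDir);
          return;
        }
        DocumentBuilderFactory dbf = DocumentBuilderFactory.newInstance();
        for (File xml : reports) {
          if (!xml.getName().startsWith("TEST-") || !xml.getName().endsWith(".xml")) {
            continue;  // skip the plain-text .txt summaries
          }
          Document doc = dbf.newDocumentBuilder().parse(xml);
          NodeList cases = doc.getElementsByTagName("testcase");
          for (int i = 0; i < cases.getLength(); i++) {
            Element tc = (Element) cases.item(i);
            // a <testcase> with a nested <error> or <failure> element did not pass
            if (tc.getElementsByTagName("error").getLength() > 0
                || tc.getElementsByTagName("failure").getLength() > 0) {
              System.out.println(tc.getAttribute("classname") + "#" + tc.getAttribute("name"));
            }
          }
        }
      }
    }
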
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating HADOOP-10072
Updating HDFS-5257
Updating YARN-1333

Build failed in Jenkins: Hadoop-Hdfs-trunk #1563

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1563/changes>

Changes:

[brandonli] HDFS-5171. NFS should create input stream for a file and try to share it with multiple read requests. Contributed by Haohui Mai

[sandy] YARN-1335. Move duplicate code from FSSchedulerApp and FiCaSchedulerApp into SchedulerApplication (Sandy Ryza)

------------------------------------------
[...truncated 12606 lines...]
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.162 sec - in org.apache.hadoop.hdfs.TestFileAppendRestart
Running org.apache.hadoop.hdfs.TestDatanodeReport
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.564 sec - in org.apache.hadoop.hdfs.TestDatanodeReport
Running org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Tests run: 10, Failures: 0, Errors: 0, Skipped: 10, Time elapsed: 0.198 sec - in org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Running org.apache.hadoop.hdfs.TestFileInputStreamCache
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.204 sec - in org.apache.hadoop.hdfs.TestFileInputStreamCache
Running org.apache.hadoop.hdfs.TestRestartDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.832 sec - in org.apache.hadoop.hdfs.TestRestartDFS
Running org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.004 sec - in org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Running org.apache.hadoop.hdfs.TestDFSRemove
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.89 sec - in org.apache.hadoop.hdfs.TestDFSRemove
Running org.apache.hadoop.hdfs.TestHDFSTrash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.948 sec - in org.apache.hadoop.hdfs.TestHDFSTrash
Running org.apache.hadoop.hdfs.TestClientReportBadBlock
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 52.965 sec - in org.apache.hadoop.hdfs.TestClientReportBadBlock
Running org.apache.hadoop.hdfs.TestQuota
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.665 sec - in org.apache.hadoop.hdfs.TestQuota
Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.605 sec - in org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Running org.apache.hadoop.hdfs.TestDatanodeRegistration
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.769 sec - in org.apache.hadoop.hdfs.TestDatanodeRegistration
Running org.apache.hadoop.hdfs.TestAbandonBlock
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.177 sec - in org.apache.hadoop.hdfs.TestAbandonBlock
Running org.apache.hadoop.hdfs.TestDFSShell
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 44.105 sec - in org.apache.hadoop.hdfs.TestDFSShell
Running org.apache.hadoop.hdfs.TestListFilesInDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.284 sec - in org.apache.hadoop.hdfs.TestListFilesInDFS
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Running org.apache.hadoop.hdfs.TestPeerCache
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.317 sec - in org.apache.hadoop.hdfs.TestPeerCache
Running org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 8.445 sec - in org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Running org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.215 sec - in org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Running org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 113.016 sec - in org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Running org.apache.hadoop.hdfs.TestMultiThreadedHflush
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.127 sec - in org.apache.hadoop.hdfs.TestMultiThreadedHflush
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.889 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Running org.apache.hadoop.hdfs.TestMiniDFSCluster
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.258 sec - in org.apache.hadoop.hdfs.TestMiniDFSCluster
Running org.apache.hadoop.hdfs.TestLease
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.046 sec - in org.apache.hadoop.hdfs.TestLease
Running org.apache.hadoop.hdfs.TestListFilesInFileContext
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.359 sec - in org.apache.hadoop.hdfs.TestListFilesInFileContext
Running org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.275 sec - in org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Running org.apache.hadoop.hdfs.TestDFSClientFailover
Tests run: 6, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 10.061 sec - in org.apache.hadoop.hdfs.TestDFSClientFailover
Running org.apache.hadoop.hdfs.TestFileAppend2
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.241 sec - in org.apache.hadoop.hdfs.TestFileAppend2
Running org.apache.hadoop.hdfs.TestLocalDFS
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.252 sec - in org.apache.hadoop.hdfs.TestLocalDFS
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.717 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.hdfs.TestSeekBug
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.78 sec - in org.apache.hadoop.hdfs.TestSeekBug
Running org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.843 sec - in org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Running org.apache.hadoop.hdfs.util.TestChunkedArrayList
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.371 sec - in org.apache.hadoop.hdfs.util.TestChunkedArrayList
Running org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.197 sec - in org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Running org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.178 sec - in org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Running org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.06 sec - in org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Running org.apache.hadoop.hdfs.util.TestDiff
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.128 sec - in org.apache.hadoop.hdfs.util.TestDiff
Running org.apache.hadoop.hdfs.util.TestMD5FileUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.246 sec - in org.apache.hadoop.hdfs.util.TestMD5FileUtils
Running org.apache.hadoop.hdfs.util.TestDirectBufferPool
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.095 sec - in org.apache.hadoop.hdfs.util.TestDirectBufferPool
Running org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.159 sec - in org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Running org.apache.hadoop.hdfs.util.TestXMLUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec - in org.apache.hadoop.hdfs.util.TestXMLUtils
Running org.apache.hadoop.hdfs.util.TestCyclicIteration
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.06 sec - in org.apache.hadoop.hdfs.util.TestCyclicIteration
Running org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.165 sec - in org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Running org.apache.hadoop.hdfs.TestSetTimes
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.405 sec - in org.apache.hadoop.hdfs.TestSetTimes
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.164 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Running org.apache.hadoop.hdfs.TestBlockReaderLocal
Tests run: 11, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 11.783 sec - in org.apache.hadoop.hdfs.TestBlockReaderLocal
Running org.apache.hadoop.hdfs.TestHftpURLTimeouts
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.948 sec - in org.apache.hadoop.hdfs.TestHftpURLTimeouts
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 48.79 sec - in org.apache.hadoop.cli.TestHDFSCLI
Running org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.132 sec - in org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Running org.apache.hadoop.fs.TestGlobPaths
Tests run: 31, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 4.337 sec - in org.apache.hadoop.fs.TestGlobPaths
Running org.apache.hadoop.fs.TestResolveHdfsSymlink
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.346 sec - in org.apache.hadoop.fs.TestResolveHdfsSymlink
Running org.apache.hadoop.fs.TestFcHdfsSetUMask
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.455 sec - in org.apache.hadoop.fs.TestFcHdfsSetUMask
Running org.apache.hadoop.fs.TestFcHdfsPermission
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.493 sec - in org.apache.hadoop.fs.TestFcHdfsPermission
Running org.apache.hadoop.fs.TestUrlStreamHandler
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.504 sec - in org.apache.hadoop.fs.TestUrlStreamHandler
Running org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.017 sec - in org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.525 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.041 sec - in org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.436 sec - in org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.94 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.132 sec - in org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Running org.apache.hadoop.fs.permission.TestStickyBit
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.87 sec - in org.apache.hadoop.fs.permission.TestStickyBit
Running org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.129 sec - in org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Running org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Tests run: 69, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.391 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Running org.apache.hadoop.fs.shell.TestHdfsTextCommand
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.531 sec - in org.apache.hadoop.fs.shell.TestHdfsTextCommand
Running org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.412 sec - in org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Running org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Tests run: 67, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.249 sec - in org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Running org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Tests run: 7, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 0.208 sec - in org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Running org.apache.hadoop.fs.TestSymlinkHdfsDisable
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.068 sec - in org.apache.hadoop.fs.TestSymlinkHdfsDisable
Running org.apache.hadoop.fs.TestVolumeId
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.067 sec - in org.apache.hadoop.fs.TestVolumeId
Running org.apache.hadoop.fs.TestSymlinkHdfsFileSystem
Tests run: 72, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 7.973 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileSystem

Results :

Tests in error: 
  TestWebHdfsFileSystemContract.testResponseCode:465 » IO All datanodes 127.0.0....
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock:244->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteOneAndAHalfBlocks:248->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteTwoBlocks:252->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testOverwrite:271->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteInNonExistentDirectory:295->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testDeleteRecursively:313->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileMoveToNonExistentDirectory:356->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileMoveToExistingDirectory:365->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileAsExistingFile:375->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileAsExistingDirectory:385->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameDirectoryMoveToExistingDirectory:407->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameDirectoryAsExistingFile:430->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameDirectoryAsExistingDirectory:439->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testInputStreamClosedTwice:461->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testOutputStreamClosedTwice:473 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testOverWriteAndRead:509->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testFilesystemIsCaseSensitive:535 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testZeroByteFilesAreFiles:562 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testMultiByteFilesAreFiles:575 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameChildDirForbidden:614->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameDirToSelf:647->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileToSelf:679->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testMoveFileUnderParent:693->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testLSRootDir:703->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testListStatusRootDir:710->FileSystemContractBaseTest.createFile:484 » IO

Tests run: 2160, Failures: 0, Errors: 26, Skipped: 50

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:58:38.616s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.847s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:58:41.298s
[INFO] Finished at: Fri Oct 25 13:32:25 UTC 2013
[INFO] Final Memory: 46M/318M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating YARN-1335
Updating HDFS-5171

Hadoop-Hdfs-trunk - Build # 1563 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1563/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 12799 lines...]
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:58:38.616s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.847s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:58:41.298s
[INFO] Finished at: Fri Oct 25 13:32:25 UTC 2013
[INFO] Final Memory: 46M/318M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating YARN-1335
Updating HDFS-5171
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
26 tests failed.
REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameFileMoveToExistingDirectory

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileMoveToExistingDirectory(FileSystemContractBaseTest.java:365)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileMoveToExistingDirectory(FileSystemContractBaseTest.java:365)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
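
In this run the "Couldn't set up IO streams" IPC failures sit side by side with "unable to create new native thread" responses (see the failures further down), which usually points at thread exhaustion on the build slave rather than a WebHDFS regression. A hypothetical watchdog that could be run alongside a long test job to log JVM thread pressure; the threshold and polling interval below are made-up values:

    import java.lang.management.ManagementFactory;
    import java.lang.management.ThreadMXBean;

    public class ThreadPressureLogger {
      public static void main(String[] args) throws InterruptedException {
        ThreadMXBean threads = ManagementFactory.getThreadMXBean();
        while (true) {
          int live = threads.getThreadCount();          // live threads in this JVM right now
          int peak = threads.getPeakThreadCount();      // high-water mark since JVM start
          long started = threads.getTotalStartedThreadCount();
          System.out.printf("live=%d peak=%d totalStarted=%d%n", live, peak, started);
          if (live > 2000) {                            // arbitrary threshold for a build slave
            System.err.println("WARNING: thread count is high; \"unable to create new native"
                + " thread\" failures become likely as OS limits are approached");
          }
          Thread.sleep(10000L);                         // poll every 10 seconds
        }
      }
    }

Note that this only observes its own JVM; when several forked test JVMs share one slave, the OS-level per-user thread/process limit matters as well.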


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameFileAsExistingFile

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileAsExistingFile(FileSystemContractBaseTest.java:375)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
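
The "code=500 != 201, op=CREATE" message reflects the WebHDFS REST contract: CREATE is a two-step PUT in which the namenode answers with a 307 redirect to a datanode and the datanode answers the actual upload with 201 Created, so a 500 here means the server side failed (in this case while trying to start a thread). A bare-bones sketch of that handshake with java.net.HttpURLConnection, using a placeholder host, port and path and no authentication parameters:

    import java.io.OutputStream;
    import java.net.HttpURLConnection;
    import java.net.URL;

    public class WebHdfsCreateByHand {
      public static void main(String[] args) throws Exception {
        // Step 1: ask the namenode where to write; expect a 307 redirect, no data yet.
        URL nn = new URL("http://localhost:50070/webhdfs/v1/tmp/by-hand?op=CREATE&overwrite=true");
        HttpURLConnection step1 = (HttpURLConnection) nn.openConnection();
        step1.setRequestMethod("PUT");
        step1.setInstanceFollowRedirects(false);  // we want to see the redirect ourselves
        int rc1 = step1.getResponseCode();
        String location = step1.getHeaderField("Location");
        step1.disconnect();
        if (rc1 != 307 || location == null) {
          throw new IllegalStateException("expected 307 redirect from the namenode, got " + rc1);
        }

        // Step 2: PUT the file content to the datanode named in the Location header;
        // a healthy cluster answers 201 Created, which is what the failing tests check for.
        HttpURLConnection step2 = (HttpURLConnection) new URL(location).openConnection();
        step2.setRequestMethod("PUT");
        step2.setDoOutput(true);
        OutputStream body = step2.getOutputStream();
        body.write("hello webhdfs".getBytes("UTF-8"));
        body.close();
        int rc2 = step2.getResponseCode();
        step2.disconnect();
        if (rc2 != HttpURLConnection.HTTP_CREATED) {  // 201
          throw new IllegalStateException("expected 201 Created, got " + rc2);
        }
        System.out.println("CREATE succeeded with HTTP " + rc2);
      }
    }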


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameFileAsExistingDirectory

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileAsExistingDirectory(FileSystemContractBaseTest.java:385)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileAsExistingDirectory(FileSystemContractBaseTest.java:385)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameDirectoryMoveToExistingDirectory

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirectoryMoveToExistingDirectory(FileSystemContractBaseTest.java:407)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirectoryMoveToExistingDirectory(FileSystemContractBaseTest.java:407)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameDirectoryAsExistingFile

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirectoryAsExistingFile(FileSystemContractBaseTest.java:430)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameDirectoryAsExistingDirectory

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirectoryAsExistingDirectory(FileSystemContractBaseTest.java:439)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirectoryAsExistingDirectory(FileSystemContractBaseTest.java:439)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testInputStreamClosedTwice

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testInputStreamClosedTwice(FileSystemContractBaseTest.java:461)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testOutputStreamClosedTwice

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOutputStreamClosedTwice(FileSystemContractBaseTest.java:473)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOutputStreamClosedTwice(FileSystemContractBaseTest.java:473)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testOverWriteAndRead

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOverWriteAndRead(FileSystemContractBaseTest.java:509)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOverWriteAndRead(FileSystemContractBaseTest.java:509)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testFilesystemIsCaseSensitive

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testFilesystemIsCaseSensitive(FileSystemContractBaseTest.java:535)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testZeroByteFilesAreFiles

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testZeroByteFilesAreFiles(FileSystemContractBaseTest.java:562)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testZeroByteFilesAreFiles(FileSystemContractBaseTest.java:562)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testMultiByteFilesAreFiles

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testMultiByteFilesAreFiles(FileSystemContractBaseTest.java:575)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testMultiByteFilesAreFiles(FileSystemContractBaseTest.java:575)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameChildDirForbidden

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameChildDirForbidden(FileSystemContractBaseTest.java:614)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameDirToSelf

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameDirToSelf(FileSystemContractBaseTest.java:647)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameFileToSelf

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileToSelf(FileSystemContractBaseTest.java:679)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileToSelf(FileSystemContractBaseTest.java:679)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testMoveFileUnderParent

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testMoveFileUnderParent(FileSystemContractBaseTest.java:693)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testMoveFileUnderParent(FileSystemContractBaseTest.java:693)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testLSRootDir

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testLSRootDir(FileSystemContractBaseTest.java:703)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testListStatusRootDir

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testListStatusRootDir(FileSystemContractBaseTest.java:710)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testListStatusRootDir(FileSystemContractBaseTest.java:710)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testResponseCode

Error Message:
All datanodes 127.0.0.1:46846 are bad. Aborting...

Stack Trace:
java.io.IOException: All datanodes 127.0.0.1:46846 are bad. Aborting...
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.hdfs.AppendTestUtil.testAppend(AppendTestUtil.java:198)
	at org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testResponseCode(TestWebHdfsFileSystemContract.java:465)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteOneBlock

Error Message:
File /test/hadoop/file could only be replicated to 0 nodes instead of minReplication (=1).  There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
 at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1391)
 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2528)
 at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:560)
 at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:387)
 at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:59582)
 at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:605)
 at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:932)
 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2056)
 at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2052)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:396)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1507)
 at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2050)


Stack Trace:
java.io.IOException: File /test/hadoop/file could only be replicated to 0 nodes instead of minReplication (=1).  There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1391)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2528)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:560)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:387)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:59582)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:605)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:932)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2056)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2052)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1507)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2050)

	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock(FileSystemContractBaseTest.java:244)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: File /test/hadoop/file could only be replicated to 0 nodes instead of minReplication (=1).  There are 2 datanode(s) running and 2 node(s) are excluded in this operation.
	at org.apache.hadoop.hdfs.server.blockmanagement.BlockManager.chooseTarget(BlockManager.java:1391)
	at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock(FSNamesystem.java:2528)
	at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.addBlock(NameNodeRpcServer.java:560)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.addBlock(ClientNamenodeProtocolServerSideTranslatorPB.java:387)
	at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java:59582)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:605)
	at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:932)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2056)
	at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2052)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1507)
	at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2050)

	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock(FileSystemContractBaseTest.java:244)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteOneAndAHalfBlocks

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneAndAHalfBlocks(FileSystemContractBaseTest.java:248)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteTwoBlocks

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteTwoBlocks(FileSystemContractBaseTest.java:252)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
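Several failures in this run report the same pattern: the WebHDFS client expected HTTP 201 (Created) for a CREATE operation but received 500, and the server's message was the thread-creation OutOfMemoryError. As a rough, hypothetical sketch of the kind of status check that produces a message of this shape (this is not the actual WebHdfsFileSystem.validateResponse; the class and method names below are invented, and the real client extracts the message from the JSON error body rather than the HTTP reason phrase):

    import java.io.IOException;
    import java.net.HttpURLConnection;

    // Hypothetical sketch: compare the HTTP status actually returned by a
    // WebHDFS-style operation against the status that operation expects,
    // and raise an IOException whose text mirrors the "Unexpected HTTP
    // response" messages in the stack traces above. The caller is assumed
    // to pass an already-opened connection for the given operation.
    class ResponseCheckSketch {
        static void validate(String op, int expectedCode, HttpURLConnection conn) throws IOException {
            int actual = conn.getResponseCode();
            if (actual != expectedCode) {
                throw new IOException("Unexpected HTTP response: code=" + actual
                        + " != " + expectedCode + ", op=" + op
                        + ", message=" + conn.getResponseMessage());
            }
        }
    }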


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testOverwrite

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOverwrite(FileSystemContractBaseTest.java:271)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testOverwrite(FileSystemContractBaseTest.java:271)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteInNonExistentDirectory

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteInNonExistentDirectory(FileSystemContractBaseTest.java:295)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteInNonExistentDirectory(FileSystemContractBaseTest.java:295)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testDeleteRecursively

Error Message:
Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread

Stack Trace:
java.io.IOException: Unexpected HTTP response: code=500 != 201, op=CREATE, message=unable to create new native thread
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:330)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testDeleteRecursively(FileSystemContractBaseTest.java:313)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testRenameFileMoveToNonExistentDirectory

Error Message:
Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 

Stack Trace:
java.io.IOException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:27)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:513)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.toIOException(WebHdfsFileSystem.java:360)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:338)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileMoveToNonExistentDirectory(FileSystemContractBaseTest.java:356)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed on local exception: java.io.IOException: Couldn't set up IO streams; Host Details : local host is: "asf005.sp2.ygridcore.net/67.195.138.27"; destination host is: "localhost":36983; 
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:337)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$700(WebHdfsFileSystem.java:110)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$1.close(WebHdfsFileSystem.java:830)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.createFile(FileSystemContractBaseTest.java:484)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testRenameFileMoveToNonExistentDirectory(FileSystemContractBaseTest.java:356)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)



Build failed in Jenkins: Hadoop-Hdfs-trunk #1562

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1562/changes>

Changes:

[jeagles] HADOOP-10064. Upgrade to maven antrun plugin version 1.7 (Arpit Agarwal via jeagles)

[kihwal] HDFS-5341. Reduce fsdataset lock duration during directory scanning. Contributed by Qus-Jiawei.

[jing9] HDFS-5363. Refactor WebHdfsFileSystem: move SPNEGO-authenticated connection creation to URLConnectionFactory. Contributed by Haohui Mai.

[jeagles] HADOOP-9016. HarFsInputStream.skip(long) must never return negative value. (Ivan A. Veselovsky via jeagles)

[atm] HDFS-5403. WebHdfs client cannot communicate with older WebHdfs servers post HDFS-5306. Contributed by Aaron T. Myers.

------------------------------------------
[...truncated 11839 lines...]
Running org.apache.hadoop.hdfs.TestEncryptedTransfer
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 79.429 sec - in org.apache.hadoop.hdfs.TestEncryptedTransfer
Running org.apache.hadoop.hdfs.TestDFSUpgrade
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 33.204 sec - in org.apache.hadoop.hdfs.TestDFSUpgrade
Running org.apache.hadoop.hdfs.TestCrcCorruption
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.332 sec - in org.apache.hadoop.hdfs.TestCrcCorruption
Running org.apache.hadoop.hdfs.TestHFlush
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.531 sec - in org.apache.hadoop.hdfs.TestHFlush
Running org.apache.hadoop.hdfs.TestFileAppendRestart
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.256 sec - in org.apache.hadoop.hdfs.TestFileAppendRestart
Running org.apache.hadoop.hdfs.TestDatanodeReport
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.417 sec - in org.apache.hadoop.hdfs.TestDatanodeReport
Running org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Tests run: 10, Failures: 0, Errors: 0, Skipped: 10, Time elapsed: 0.193 sec - in org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Running org.apache.hadoop.hdfs.TestFileInputStreamCache
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.205 sec - in org.apache.hadoop.hdfs.TestFileInputStreamCache
Running org.apache.hadoop.hdfs.TestRestartDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.738 sec - in org.apache.hadoop.hdfs.TestRestartDFS
Running org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.903 sec - in org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Running org.apache.hadoop.hdfs.TestDFSRemove
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.991 sec - in org.apache.hadoop.hdfs.TestDFSRemove
Running org.apache.hadoop.hdfs.TestHDFSTrash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.961 sec - in org.apache.hadoop.hdfs.TestHDFSTrash
Running org.apache.hadoop.hdfs.TestClientReportBadBlock
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 70.327 sec - in org.apache.hadoop.hdfs.TestClientReportBadBlock
Running org.apache.hadoop.hdfs.TestQuota
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.615 sec - in org.apache.hadoop.hdfs.TestQuota
Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.794 sec - in org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Running org.apache.hadoop.hdfs.TestDatanodeRegistration
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.815 sec - in org.apache.hadoop.hdfs.TestDatanodeRegistration
Running org.apache.hadoop.hdfs.TestAbandonBlock
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.214 sec - in org.apache.hadoop.hdfs.TestAbandonBlock
Running org.apache.hadoop.hdfs.TestDFSShell
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.68 sec - in org.apache.hadoop.hdfs.TestDFSShell
Running org.apache.hadoop.hdfs.TestListFilesInDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.443 sec - in org.apache.hadoop.hdfs.TestListFilesInDFS
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.162 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Running org.apache.hadoop.hdfs.TestPeerCache
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.316 sec - in org.apache.hadoop.hdfs.TestPeerCache
Running org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 8.422 sec - in org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Running org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.132 sec - in org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Running org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 124.263 sec - in org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Running org.apache.hadoop.hdfs.TestMultiThreadedHflush
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.083 sec - in org.apache.hadoop.hdfs.TestMultiThreadedHflush
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.67 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Running org.apache.hadoop.hdfs.TestMiniDFSCluster
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.089 sec - in org.apache.hadoop.hdfs.TestMiniDFSCluster
Running org.apache.hadoop.hdfs.TestLease
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.74 sec - in org.apache.hadoop.hdfs.TestLease
Running org.apache.hadoop.hdfs.TestListFilesInFileContext
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.352 sec - in org.apache.hadoop.hdfs.TestListFilesInFileContext
Running org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.317 sec - in org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Running org.apache.hadoop.hdfs.TestDFSClientFailover
Tests run: 6, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 10.479 sec - in org.apache.hadoop.hdfs.TestDFSClientFailover
Running org.apache.hadoop.hdfs.TestFileAppend2
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.573 sec - in org.apache.hadoop.hdfs.TestFileAppend2
Running org.apache.hadoop.hdfs.TestLocalDFS
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.174 sec - in org.apache.hadoop.hdfs.TestLocalDFS
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.831 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.hdfs.TestSeekBug
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.546 sec - in org.apache.hadoop.hdfs.TestSeekBug
Running org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.852 sec - in org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Running org.apache.hadoop.hdfs.util.TestChunkedArrayList
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.382 sec - in org.apache.hadoop.hdfs.util.TestChunkedArrayList
Running org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.198 sec - in org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Running org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.194 sec - in org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Running org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Running org.apache.hadoop.hdfs.util.TestDiff
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.308 sec - in org.apache.hadoop.hdfs.util.TestDiff
Running org.apache.hadoop.hdfs.util.TestMD5FileUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.23 sec - in org.apache.hadoop.hdfs.util.TestMD5FileUtils
Running org.apache.hadoop.hdfs.util.TestDirectBufferPool
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.091 sec - in org.apache.hadoop.hdfs.util.TestDirectBufferPool
Running org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.157 sec - in org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Running org.apache.hadoop.hdfs.util.TestXMLUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestXMLUtils
Running org.apache.hadoop.hdfs.util.TestCyclicIteration
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestCyclicIteration
Running org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.165 sec - in org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Running org.apache.hadoop.hdfs.TestSetTimes
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.69 sec - in org.apache.hadoop.hdfs.TestSetTimes
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Running org.apache.hadoop.hdfs.TestBlockReaderLocal
Tests run: 11, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 12.717 sec - in org.apache.hadoop.hdfs.TestBlockReaderLocal
Running org.apache.hadoop.hdfs.TestHftpURLTimeouts
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.94 sec - in org.apache.hadoop.hdfs.TestHftpURLTimeouts
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 48.003 sec - in org.apache.hadoop.cli.TestHDFSCLI
Running org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.132 sec - in org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Running org.apache.hadoop.fs.TestGlobPaths
Tests run: 31, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 4.354 sec - in org.apache.hadoop.fs.TestGlobPaths
Running org.apache.hadoop.fs.TestResolveHdfsSymlink
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.336 sec - in org.apache.hadoop.fs.TestResolveHdfsSymlink
Running org.apache.hadoop.fs.TestFcHdfsSetUMask
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.509 sec - in org.apache.hadoop.fs.TestFcHdfsSetUMask
Running org.apache.hadoop.fs.TestFcHdfsPermission
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.425 sec - in org.apache.hadoop.fs.TestFcHdfsPermission
Running org.apache.hadoop.fs.TestUrlStreamHandler
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.521 sec - in org.apache.hadoop.fs.TestUrlStreamHandler
Running org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.925 sec - in org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.626 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.413 sec - in org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.296 sec - in org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.401 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.082 sec - in org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Running org.apache.hadoop.fs.permission.TestStickyBit
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.725 sec - in org.apache.hadoop.fs.permission.TestStickyBit
Running org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.4 sec - in org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Running org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Tests run: 69, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.367 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Running org.apache.hadoop.fs.shell.TestHdfsTextCommand
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.39 sec - in org.apache.hadoop.fs.shell.TestHdfsTextCommand
Running org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.519 sec - in org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Running org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Tests run: 67, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.726 sec - in org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Running org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Tests run: 7, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 0.207 sec - in org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Running org.apache.hadoop.fs.TestSymlinkHdfsDisable
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.936 sec - in org.apache.hadoop.fs.TestSymlinkHdfsDisable
Running org.apache.hadoop.fs.TestVolumeId
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.067 sec - in org.apache.hadoop.fs.TestVolumeId
Running org.apache.hadoop.fs.TestSymlinkHdfsFileSystem
Tests run: 72, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 7.841 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileSystem

Results :

Tests in error: 
  TestNameNodeHttpServer.testSslConfiguration:34 » Bind Address already in use
  TestWebHdfsFileSystemContract.testResponseCode:465 » IO All datanodes 127.0.0....
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteHalfABlock:240->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock:244->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteOneAndAHalfBlocks:248->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteTwoBlocks:252->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testOverwrite:271->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteInNonExistentDirectory:295->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testDeleteRecursively:313->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileMoveToNonExistentDirectory:356->FileSystemContractBaseTest.createFile:484 » SocketTimeout

Tests run: 2160, Failures: 0, Errors: 10, Skipped: 50

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:44:21.066s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.843s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:44:23.761s
[INFO] Finished at: Thu Oct 24 13:18:10 UTC 2013
[INFO] Final Memory: 38M/323M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating HDFS-5363
Updating HADOOP-10064
Updating HDFS-5403
Updating HDFS-5341
Updating HDFS-5306
Updating HADOOP-9016

Build failed in Jenkins: Hadoop-Hdfs-trunk #1561

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1561/changes>

Changes:

[jeagles] HADOOP-9598. Improve code coverage of RMAdminCLI (Aleksey Gorshkov and Andrey Klochkov via jeagles)

[bikas] YARN-1305. RMHAProtocolService#serviceInit should handle HAUtil's IllegalArgumentException (Tsuyoshi Ozawa via bikas)

[sandy] YARN-1330. Fair Scheduler: defaultQueueSchedulingPolicy does not take effect (Sandy Ryza)

[jlowe] MAPREDUCE-5561. org.apache.hadoop.mapreduce.v2.app.job.impl.TestJobImpl testcase failing on trunk. Contributed by Karthik Kambatla

[jeagles] YARN-1183. MiniYARNCluster shutdown takes several minutes intermittently (Andrey Klochkov via jeagles)

[cmccabe] HDFS-5400. DFS_CLIENT_MMAP_CACHE_THREAD_RUNS_PER_TIMEOUT constant is set to the wrong value (Colin Patrick McCabe)

------------------------------------------
[...truncated 11392 lines...]
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
[...the preceding two lines repeat 79 more times...]

Results :

Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
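"Unable to create new native thread" means the JVM asked the operating system for another native thread and was refused, typically because the build slave's per-user process/thread limit (or the native memory available for thread stacks) was exhausted while Surefire was forking test JVMs; that is why no tests ran at all in this build. A minimal, self-contained sketch of the condition (illustrative only: the class name and the 100000-thread cap are invented, and running it deliberately pushes the host toward its thread limit):

    // Illustrative only: start sleeping threads until the OS refuses to
    // create another one, which surfaces as the same OutOfMemoryError
    // seen in the truncated output above.
    public class NativeThreadLimitDemo {
        public static void main(String[] args) {
            int started = 0;
            try {
                // Bounded so the demo terminates even on hosts with very high limits.
                for (int i = 0; i < 100000; i++) {
                    Thread t = new Thread(new Runnable() {
                        public void run() {
                            try {
                                Thread.sleep(60000L); // stay alive so the thread counts against the limit
                            } catch (InterruptedException ignored) {
                            }
                        }
                    });
                    t.setDaemon(true); // allow the JVM to exit without joining demo threads
                    t.start();
                    started++;
                }
            } catch (OutOfMemoryError e) {
                // The same error that aborted the forked test JVMs in this build.
                System.err.println("Thread creation refused after " + started + " threads: " + e);
            }
            System.out.println("Threads started: " + started);
        }
    }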

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:03.933s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.778s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:06.567s
[INFO] Finished at: Wed Oct 23 11:35:55 UTC 2013
[INFO] Final Memory: 56M/806M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: ExecutionException; nested exception is java.util.concurrent.ExecutionException: java.lang.RuntimeException: The forked VM terminated without saying properly goodbye. VM crash or System.exit called ?
[ERROR] Command was /bin/sh -c cd <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs> && /home/jenkins/tools/java/jdk1.6.0_26/jre/bin/java -Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -jar <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter6719342539522921382.jar> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire8595823999250178453tmp> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_05623344360087798607tmp>
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating YARN-1305
Updating HADOOP-9598
Updating HDFS-5400
Updating YARN-1183
Updating MAPREDUCE-5561
Updating YARN-1330

Hadoop-Hdfs-trunk - Build # 1561 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1561/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 11585 lines...]

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:03.933s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.778s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:06.567s
[INFO] Finished at: Wed Oct 23 11:35:55 UTC 2013
[INFO] Final Memory: 56M/806M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: ExecutionException; nested exception is java.util.concurrent.ExecutionException: java.lang.RuntimeException: The forked VM terminated without saying properly goodbye. VM crash or System.exit called ?
[ERROR] Command was /bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs && /home/jenkins/tools/java/jdk1.6.0_26/jre/bin/java -Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter6719342539522921382.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire8595823999250178453tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_05623344360087798607tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating YARN-1305
Updating HADOOP-9598
Updating HDFS-5400
Updating YARN-1183
Updating MAPREDUCE-5561
Updating YARN-1330
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.
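
The Surefire error above ("The forked VM terminated without saying properly goodbye. VM crash or System.exit called ?") is the generic message Surefire prints whenever a forked test JVM dies before it can report results back to Maven. One classic cause, sketched below as a hypothetical JUnit test (illustrative only, not taken from this build), is test or production code that calls System.exit():

    import org.junit.Test;

    public class ExitingTest {
        @Test
        public void killsTheForkedJvm() {
            // Surefire runs tests in a separate forked JVM; calling exit() here
            // terminates that whole JVM, so Maven never receives a test report
            // and can only say the fork "terminated without saying properly goodbye".
            System.exit(1);
        }
    }

In the builds that follow, the console output instead shows the forked JVMs failing at startup with OutOfMemoryError, which ends in the same Surefire message.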

Build failed in Jenkins: Hadoop-Hdfs-trunk #1560

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1560/changes>

Changes:

[sandy] YARN-1315. TestQueueACLs should also test FairScheduler (Sandy Ryza)

[cnauroth] YARN-1331. yarn.cmd exits with NoClassDefFoundError trying to run rmadmin or logs. Contributed by Chris Nauroth.

[jeagles] HADOOP-9291. enhance unit-test coverage of package o.a.h.metrics2 (Ivan A. Veselovsky via jeagles)

[szetszwo] HDFS-4885. Improve the verifyBlockPlacement() API in BlockPlacementPolicy. Contributed by Junping Du

[brandonli] HDFS-5347. Add HDFS NFS user guide. Contributed by Brandon Li

[jing9] HDFS-5382. Implement the UI of browsing filesystems in HTML 5 page. Contributed by Haohui Mai.

[sandy] YARN-1288. Make Fair Scheduler ACLs more user friendly (Sandy Ryza)

[sandy] YARN-1258: Move to 2.2.1 in CHANGES.txt

[sandy] Reverting "YARN-1258: Move to 2.2.1 in CHANGES.txt" because it contained unintended changes

[sandy] YARN-1258: Move to 2.2.1 in CHANGES.txt

------------------------------------------
[...truncated 11393 lines...]
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
[...the preceding two lines repeat many more times in the truncated console output...]

Results :

Tests run: 0, Failures: 0, Errors: 0, Skipped: 0

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:02.981s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.841s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:05.677s
[INFO] Finished at: Tue Oct 22 11:35:47 UTC 2013
[INFO] Final Memory: 45M/736M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: ExecutionException; nested exception is java.util.concurrent.ExecutionException: java.lang.RuntimeException: The forked VM terminated without saying properly goodbye. VM crash or System.exit called ?
[ERROR] Command was/bin/sh -c cd <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs> && /home/jenkins/tools/java/jdk1.6.0_26/jre/bin/java -Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -jar <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter3498690175466967346.jar> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire497387182831793143tmp> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_04799335426329602194tmp>
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating HDFS-4885
Updating YARN-1288
Updating HADOOP-9291
Updating YARN-1258
Updating YARN-1315
Updating HDFS-5382
Updating HDFS-5347
Updating YARN-1331
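
The long run of "java.lang.OutOfMemoryError: unable to create new native thread" messages above, paired with "Error occurred during initialization of VM", means the forked test JVMs could not even start: the build slave had run out of native threads at the operating-system level (per-user process/thread limits or native memory for thread stacks), which is unrelated to the -Xmx1024m heap setting on the surefire command line. A minimal standalone sketch (hypothetical, not from the build) that reproduces the same error class by starting threads until the native limit is hit (note that running it will briefly starve the machine of threads):

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.CountDownLatch;

    public class NativeThreadExhaustion {
        public static void main(String[] args) {
            final CountDownLatch block = new CountDownLatch(1);  // keeps every started thread parked
            List<Thread> threads = new ArrayList<Thread>();
            try {
                while (true) {
                    Thread t = new Thread(new Runnable() {
                        public void run() {
                            try { block.await(); } catch (InterruptedException ignored) { }
                        }
                    });
                    t.start();       // throws OutOfMemoryError once the OS refuses another native thread
                    threads.add(t);
                }
            } catch (OutOfMemoryError e) {
                // Same message as in the console above, even though the heap is largely unused.
                System.err.println("Created " + threads.size() + " threads before: " + e);
            } finally {
                block.countDown();   // release the parked threads so the JVM can exit
            }
        }
    }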

Build failed in Jenkins: Hadoop-Hdfs-trunk #1559

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1559/changes>

Changes:

[umamahesh] HDFS-5331 make SnapshotDiff.java to a o.a.h.util.Tool interface implementation. Contributed by Vinay.

------------------------------------------
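
Build #1559 picks up HDFS-5331, which converts SnapshotDiff to the org.apache.hadoop.util.Tool interface. For context, here is a hedged sketch of the general Tool/ToolRunner pattern that change moves to; the class name and body are illustrative, not the actual patch:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.conf.Configured;
    import org.apache.hadoop.util.Tool;
    import org.apache.hadoop.util.ToolRunner;

    public class ExampleTool extends Configured implements Tool {
        @Override
        public int run(String[] args) throws Exception {
            // A real tool (e.g. a snapshot-diff command) would parse args here and
            // use getConf() to reach the cluster; ToolRunner has already applied
            // generic options such as -conf, -D and -fs to that Configuration.
            System.out.println("args: " + java.util.Arrays.toString(args));
            return 0;   // a non-zero return value signals failure to the caller
        }

        public static void main(String[] args) throws Exception {
            System.exit(ToolRunner.run(new Configuration(), new ExampleTool(), args));
        }
    }

Going through ToolRunner is what gives a command the standard generic option handling (-conf, -D, -fs, and so on) provided by GenericOptionsParser.
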
[...truncated 11376 lines...]
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
[...the preceding two lines repeat many more times in the truncated console output...]

Results :

Tests run: 0, Failures: 0, Errors: 0, Skipped: 0

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:02.764s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.909s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:05.527s
[INFO] Finished at: Mon Oct 21 11:35:53 UTC 2013
[INFO] Final Memory: 48M/685M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: ExecutionException; nested exception is java.util.concurrent.ExecutionException: java.lang.RuntimeException: The forked VM terminated without saying properly goodbye. VM crash or System.exit called ?
[ERROR] Command was/bin/sh -c cd <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs> && /home/jenkins/tools/java/jdk1.6.0_26/jre/bin/java -Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -jar <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter3875405237570897140.jar> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire1273021569871565286tmp> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_05442859237382453126tmp>
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating HDFS-5331

Hadoop-Hdfs-trunk - Build # 1559 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1559/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 11569 lines...]
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:02.764s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.909s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:05.527s
[INFO] Finished at: Mon Oct 21 11:35:53 UTC 2013
[INFO] Final Memory: 48M/685M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: ExecutionException; nested exception is java.util.concurrent.ExecutionException: java.lang.RuntimeException: The forked VM terminated without saying properly goodbye. VM crash or System.exit called ?
[ERROR] Command was/bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs && /home/jenkins/tools/java/jdk1.6.0_26/jre/bin/java -Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter3875405237570897140.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire1273021569871565286tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_05442859237382453126tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating HDFS-5331
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Build failed in Jenkins: Hadoop-Hdfs-trunk #1558

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1558/changes>

Changes:

[vinodkv] YARN-1185. Fixed FileSystemRMStateStore to not leave partial files that prevent subsequent ResourceManager recovery. Contributed by Omkar Vinit Joshi.

------------------------------------------
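
YARN-1185 above addresses FileSystemRMStateStore leaving partial files that break ResourceManager recovery. The standard defence against partial files is to write to a temporary name and atomically rename it into place once the bytes are safely written; the following is a generic java.nio.file (Java 7+) sketch of that idea, not the actual RM state-store code:

    import java.io.IOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardCopyOption;

    public class AtomicStateWriter {
        /**
         * Writes data to target without ever exposing a partially written file:
         * bytes go to a sibling ".tmp" file first, then an atomic rename publishes them.
         */
        public static void write(Path target, byte[] data) throws IOException {
            Path tmp = target.resolveSibling(target.getFileName() + ".tmp");
            Files.write(tmp, data);   // a crash here leaves only the .tmp file behind
            // Readers see either the old file or the new one, never a fragment;
            // ATOMIC_MOVE fails fast if the file system cannot rename atomically.
            Files.move(tmp, target, StandardCopyOption.ATOMIC_MOVE);
        }

        public static void main(String[] args) throws IOException {
            write(Paths.get("rm-state.bin"), "example state".getBytes("UTF-8"));
        }
    }
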
[...truncated 12510 lines...]
	at java.lang.ref.Reference.<clinit>(Reference.java:145)
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
	at java.lang.Thread.start0(Native Method)
	at java.lang.Thread.start(Thread.java:640)
	at java.lang.ref.Reference.<clinit>(Reference.java:145)
[...the preceding five-line error block repeats many more times in the truncated console output...]

Results :

Tests run: 0, Failures: 0, Errors: 0, Skipped: 0

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [52.970s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.746s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 55.557s
[INFO] Finished at: Sun Oct 20 11:34:36 UTC 2013
[INFO] Final Memory: 55M/826M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: ExecutionException; nested exception is java.util.concurrent.ExecutionException: java.lang.RuntimeException: The forked VM terminated without saying properly goodbye. VM crash or System.exit called ?
[ERROR] Command was/bin/sh -c cd <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs> && /home/jenkins/tools/java/jdk1.6.0_26/jre/bin/java -Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -jar <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter8825826774791749944.jar> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire2324345505891715669tmp> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_04863952880236005190tmp>
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating YARN-1185

Hadoop-Hdfs-trunk - Build # 1558 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1558/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 12703 lines...]
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [52.970s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.746s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 55.557s
[INFO] Finished at: Sun Oct 20 11:34:36 UTC 2013
[INFO] Final Memory: 55M/826M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: ExecutionException; nested exception is java.util.concurrent.ExecutionException: java.lang.RuntimeException: The forked VM terminated without saying properly goodbye. VM crash or System.exit called ?
[ERROR] Command was/bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs && /home/jenkins/tools/java/jdk1.6.0_26/jre/bin/java -Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter8825826774791749944.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire2324345505891715669tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_04863952880236005190tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating YARN-1185
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Build failed in Jenkins: Hadoop-Hdfs-trunk #1557

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1557/changes>

Changes:

[cmccabe] HDFS-5276. FileSystem.Statistics should use thread-local counters to avoid multi-threaded performance issues on read/write.  (Colin Patrick McCabe)

[jeagles] MAPREDUCE-5587. TestTextOutputFormat fails on JDK7 (jeagles)

[sandy] MAPREDUCE-5457. Add a KeyOnlyTextOutputReader to enable streaming to write out text files without separators (Sandy Ryza)

[cnauroth] HDFS-5365. Move attribution to 2.2.1 in CHANGES.txt.

------------------------------------------
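
HDFS-5276 above changes FileSystem.Statistics to use thread-local counters so that heavily multi-threaded readers and writers stop contending on shared counters. Below is a hedged, generic sketch of that "per-thread cells, aggregate on read" pattern; it illustrates the idea only and is not the Hadoop implementation:

    import java.util.ArrayList;
    import java.util.List;

    public class ThreadLocalCounter {
        /** Per-thread cell; only its owning thread ever writes it. */
        private static final class Cell {
            volatile long value;   // volatile so the aggregating reader sees recent writes
        }

        private final List<Cell> allCells = new ArrayList<Cell>();

        private final ThreadLocal<Cell> myCell = new ThreadLocal<Cell>() {
            @Override protected Cell initialValue() {
                Cell c = new Cell();
                synchronized (allCells) {   // registration is rare; the hot path stays lock-free
                    allCells.add(c);
                }
                return c;
            }
        };

        /** Hot path: each thread bumps only its own cell, so threads never contend. */
        public void increment(long delta) {
            myCell.get().value += delta;    // single writer per cell, so the non-atomic += is safe
        }

        /** Slow path: sum every per-thread cell to produce the global total. */
        public long aggregate() {
            long sum = 0;
            synchronized (allCells) {
                for (Cell c : allCells) {
                    sum += c.value;
                }
            }
            return sum;
        }

        public static void main(String[] args) throws InterruptedException {
            final ThreadLocalCounter bytesRead = new ThreadLocalCounter();
            Thread a = new Thread(new Runnable() { public void run() { bytesRead.increment(100); } });
            Thread b = new Thread(new Runnable() { public void run() { bytesRead.increment(23); } });
            a.start(); b.start();
            a.join(); b.join();
            System.out.println("total bytes read: " + bytesRead.aggregate());   // prints 123
        }
    }

The trade-off is that increments become uncontended while reading the total becomes a slower sum over all registered cells.
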
[...truncated 11377 lines...]
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
[...the preceding two lines repeat many more times in the truncated console output...]
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread

Results :

Tests run: 0, Failures: 0, Errors: 0, Skipped: 0

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:03.026s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.741s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:05.609s
[INFO] Finished at: Sat Oct 19 11:35:46 UTC 2013
[INFO] Final Memory: 47M/629M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: ExecutionException; nested exception is java.util.concurrent.ExecutionException: java.lang.RuntimeException: The forked VM terminated without saying properly goodbye. VM crash or System.exit called ?
[ERROR] Command was/bin/sh -c cd <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs> && /home/jenkins/tools/java/jdk1.6.0_26/jre/bin/java -Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -jar <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter3965599638341235819.jar> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire6436526841667184575tmp> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_02190178304488400303tmp>
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating MAPREDUCE-5587
Updating HDFS-5276
Updating HDFS-5365
Updating MAPREDUCE-5457

Hadoop-Hdfs-trunk - Build # 1557 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1557/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 11570 lines...]
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:03.026s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.741s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:05.609s
[INFO] Finished at: Sat Oct 19 11:35:46 UTC 2013
[INFO] Final Memory: 47M/629M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: ExecutionException; nested exception is java.util.concurrent.ExecutionException: java.lang.RuntimeException: The forked VM terminated without saying properly goodbye. VM crash or System.exit called ?
[ERROR] Command was/bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs && /home/jenkins/tools/java/jdk1.6.0_26/jre/bin/java -Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter3965599638341235819.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire6436526841667184575tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_02190178304488400303tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating MAPREDUCE-5587
Updating HDFS-5276
Updating HDFS-5365
Updating MAPREDUCE-5457
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Build failed in Jenkins: Hadoop-Hdfs-trunk #1556

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1556/changes>

Changes:

[cnauroth] HDFS-5365. Fix libhdfs compile error on FreeBSD9. Contributed by Radim Kolar.

[jeagles] HDFS-4511. Cover package org.apache.hadoop.hdfs.tools with unit test (Andrey Klochkov via jeagles)

[suresh] HDFS-5374. Remove deadcode in DFSOutputStream. Contributed by Suresh Srinivas.

[cnauroth] HADOOP-10055. FileSystemShell.apt.vm doc has typo "numRepicas". Contributed by Akira Ajisaka.

[cnauroth] HDFS-5336. DataNode should not output StartupProgress metrics. Contributed by Akira Ajisaka.

[jing9] HDFS-5379. Update links to datanode information in dfshealth.html. Contributed by Haohui Mai.

[cnauroth] HDFS-5375. hdfs.cmd does not expose several snapshot commands. Contributed by Chris Nauroth.

------------------------------------------
[...truncated 12491 lines...]
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
	at java.lang.Thread.start0(Native Method)
	at java.lang.Thread.start(Thread.java:640)
	at java.lang.ref.Reference.<clinit>(Reference.java:145)
Error occurred during initialization of VM
java.lang.OutOfMemoryError: unable to create new native thread
	at java.lang.Thread.start0(Native Method)
	at java.lang.Thread.start(Thread.java:640)
	at java.lang.ref.Reference.<clinit>(Reference.java:145)
[...the five lines above repeated 30 more times...]

Results :

Tests run: 0, Failures: 0, Errors: 0, Skipped: 0

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [52.621s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.731s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 55.207s
[INFO] Finished at: Fri Oct 18 11:35:30 UTC 2013
[INFO] Final Memory: 49M/721M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: ExecutionException; nested exception is java.util.concurrent.ExecutionException: java.lang.RuntimeException: The forked VM terminated without saying properly goodbye. VM crash or System.exit called ?
[ERROR] Command was/bin/sh -c cd <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs> && /home/jenkins/tools/java/jdk1.6.0_26/jre/bin/java -Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -jar <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter3540211596762901003.jar> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire7378735037527787062tmp> <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_05228567622829103228tmp>
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating HDFS-5375
Updating HDFS-5374
Updating HDFS-5379
Updating HDFS-4511
Updating HDFS-5336
Updating HADOOP-10055
Updating HDFS-5365

Hadoop-Hdfs-trunk - Build # 1556 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1556/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 12684 lines...]
main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [52.621s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.731s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 55.207s
[INFO] Finished at: Fri Oct 18 11:35:30 UTC 2013
[INFO] Final Memory: 49M/721M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: ExecutionException; nested exception is java.util.concurrent.ExecutionException: java.lang.RuntimeException: The forked VM terminated without saying properly goodbye. VM crash or System.exit called ?
[ERROR] Command was/bin/sh -c cd /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs && /home/jenkins/tools/java/jdk1.6.0_26/jre/bin/java -Xmx1024m -XX:+HeapDumpOnOutOfMemoryError -jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefirebooter3540211596762901003.jar /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire7378735037527787062tmp /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire/surefire_05228567622829103228tmp
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating HDFS-5375
Updating HDFS-5374
Updating HDFS-5379
Updating HDFS-4511
Updating HDFS-5336
Updating HADOOP-10055
Updating HDFS-5365
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.

Build failed in Jenkins: Hadoop-Hdfs-trunk #1555

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1555/changes>

Changes:

[jing9] HDFS-5334. Implement dfshealth.jsp in HTML pages. Contributed by Haohui Mai.

[wang] Move HDFS-5360 to 2.2.1 in CHANGES.txt

[szetszwo] HDFS-4376. Fix race conditions in Balancer.  Contributed by Junping Du

[wang] HDFS-5360. Improvement of usage message of renameSnapshot and deleteSnapshot. Contributed by Shinichi Yamashita.

[jeagles] HADOOP-9078. enhance unit-test coverage of class org.apache.hadoop.fs.FileContext (Ivan A. Veselovsky via jeagles)

[kihwal] HDFS-5346. Avoid unnecessary call to getNumLiveDataNodes() for each block during IBR processing. Contributed by Ravi Prakash.

[suresh] HADOOP-10005. No need to check INFO severity level is enabled or not. Contributed by Jackie Chang.

[suresh] HDFS-5370. Typo in Error Message: different between range in condition and range in error message. Contributed by Kousuke Saruta.

[jeagles] MAPREDUCE-5586. TestCopyMapper#testCopyFailOnBlockSizeDifference fails when run from hadoop-tools/hadoop-distcp directory (jeagles)

[cnauroth] HADOOP-9897. Add method to get path start position without drive specifier in o.a.h.fs.Path. Contributed by Binglin Chang.

[jeagles] MAPREDUCE-5585. TestCopyCommitter#testNoCommitAction Fails on JDK7 (jeagles)

[szetszwo] Add TestOpenFilesWithSnapshot.java for HDFS-5283.

[szetszwo] HDFS-5283. Under construction blocks only inside snapshots should not be counted in safemode threshhold.  Contributed by Vinay

------------------------------------------
[...truncated 12526 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.123 sec - in org.apache.hadoop.hdfs.TestDFSRemove
Running org.apache.hadoop.hdfs.TestHDFSTrash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.021 sec - in org.apache.hadoop.hdfs.TestHDFSTrash
Running org.apache.hadoop.hdfs.TestClientReportBadBlock
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 66.957 sec - in org.apache.hadoop.hdfs.TestClientReportBadBlock
Running org.apache.hadoop.hdfs.TestQuota
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.859 sec - in org.apache.hadoop.hdfs.TestQuota
Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.539 sec - in org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Running org.apache.hadoop.hdfs.TestDatanodeRegistration
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.912 sec - in org.apache.hadoop.hdfs.TestDatanodeRegistration
Running org.apache.hadoop.hdfs.TestAbandonBlock
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.764 sec - in org.apache.hadoop.hdfs.TestAbandonBlock
Running org.apache.hadoop.hdfs.TestDFSShell
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 43.597 sec - in org.apache.hadoop.hdfs.TestDFSShell
Running org.apache.hadoop.hdfs.TestListFilesInDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.413 sec - in org.apache.hadoop.hdfs.TestListFilesInDFS
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Running org.apache.hadoop.hdfs.TestPeerCache
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.318 sec - in org.apache.hadoop.hdfs.TestPeerCache
Running org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 8.528 sec - in org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Running org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.942 sec - in org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Running org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 103.049 sec - in org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Running org.apache.hadoop.hdfs.TestMultiThreadedHflush
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.056 sec - in org.apache.hadoop.hdfs.TestMultiThreadedHflush
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.752 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Running org.apache.hadoop.hdfs.TestMiniDFSCluster
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.524 sec - in org.apache.hadoop.hdfs.TestMiniDFSCluster
Running org.apache.hadoop.hdfs.TestLease
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.634 sec - in org.apache.hadoop.hdfs.TestLease
Running org.apache.hadoop.hdfs.TestListFilesInFileContext
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.364 sec - in org.apache.hadoop.hdfs.TestListFilesInFileContext
Running org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.258 sec - in org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Running org.apache.hadoop.hdfs.TestDFSClientFailover
Tests run: 6, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 10.302 sec - in org.apache.hadoop.hdfs.TestDFSClientFailover
Running org.apache.hadoop.hdfs.TestFileAppend2
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.497 sec - in org.apache.hadoop.hdfs.TestFileAppend2
Running org.apache.hadoop.hdfs.TestLocalDFS
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.27 sec - in org.apache.hadoop.hdfs.TestLocalDFS
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.956 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.hdfs.TestSeekBug
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.618 sec - in org.apache.hadoop.hdfs.TestSeekBug
Running org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.961 sec - in org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Running org.apache.hadoop.hdfs.util.TestChunkedArrayList
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.361 sec - in org.apache.hadoop.hdfs.util.TestChunkedArrayList
Running org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.198 sec - in org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Running org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.19 sec - in org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Running org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Running org.apache.hadoop.hdfs.util.TestDiff
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.385 sec - in org.apache.hadoop.hdfs.util.TestDiff
Running org.apache.hadoop.hdfs.util.TestMD5FileUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.227 sec - in org.apache.hadoop.hdfs.util.TestMD5FileUtils
Running org.apache.hadoop.hdfs.util.TestDirectBufferPool
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.091 sec - in org.apache.hadoop.hdfs.util.TestDirectBufferPool
Running org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.159 sec - in org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Running org.apache.hadoop.hdfs.util.TestXMLUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec - in org.apache.hadoop.hdfs.util.TestXMLUtils
Running org.apache.hadoop.hdfs.util.TestCyclicIteration
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestCyclicIteration
Running org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.165 sec - in org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Running org.apache.hadoop.hdfs.TestSetTimes
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.302 sec - in org.apache.hadoop.hdfs.TestSetTimes
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Running org.apache.hadoop.hdfs.TestBlockReaderLocal
Tests run: 11, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 12.076 sec - in org.apache.hadoop.hdfs.TestBlockReaderLocal
Running org.apache.hadoop.hdfs.TestHftpURLTimeouts
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.973 sec - in org.apache.hadoop.hdfs.TestHftpURLTimeouts
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.571 sec - in org.apache.hadoop.cli.TestHDFSCLI
Running org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.132 sec - in org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Running org.apache.hadoop.fs.TestGlobPaths
Tests run: 31, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 4.272 sec - in org.apache.hadoop.fs.TestGlobPaths
Running org.apache.hadoop.fs.TestResolveHdfsSymlink
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.345 sec - in org.apache.hadoop.fs.TestResolveHdfsSymlink
Running org.apache.hadoop.fs.TestFcHdfsSetUMask
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.556 sec - in org.apache.hadoop.fs.TestFcHdfsSetUMask
Running org.apache.hadoop.fs.TestFcHdfsPermission
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.712 sec - in org.apache.hadoop.fs.TestFcHdfsPermission
Running org.apache.hadoop.fs.TestUrlStreamHandler
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.478 sec - in org.apache.hadoop.fs.TestUrlStreamHandler
Running org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.08 sec - in org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.473 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.975 sec - in org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.413 sec - in org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.161 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.083 sec - in org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Running org.apache.hadoop.fs.permission.TestStickyBit
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.517 sec - in org.apache.hadoop.fs.permission.TestStickyBit
Running org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.387 sec - in org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Running org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Tests run: 69, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.173 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Running org.apache.hadoop.fs.shell.TestHdfsTextCommand
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.687 sec - in org.apache.hadoop.fs.shell.TestHdfsTextCommand
Running org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.346 sec - in org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Running org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Tests run: 67, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.781 sec - in org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Running org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Tests run: 7, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 0.228 sec - in org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Running org.apache.hadoop.fs.TestSymlinkHdfsDisable
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.136 sec - in org.apache.hadoop.fs.TestSymlinkHdfsDisable
Running org.apache.hadoop.fs.TestVolumeId
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.067 sec - in org.apache.hadoop.fs.TestVolumeId
Running org.apache.hadoop.fs.TestSymlinkHdfsFileSystem
Tests run: 72, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 7.781 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileSystem

Results :

Tests in error: 
  TestWebHdfsFileSystemContract.testResponseCode:465 » IO All datanodes 127.0.0....
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock:244->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteOneAndAHalfBlocks:248->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteTwoBlocks:252->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testOverwrite:271->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteInNonExistentDirectory:295->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testDeleteRecursively:313->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileMoveToNonExistentDirectory:356->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileMoveToExistingDirectory:365->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileAsExistingFile:375->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileAsExistingDirectory:385->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameDirectoryMoveToExistingDirectory:407->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameDirectoryAsExistingFile:430->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameDirectoryAsExistingDirectory:439->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testInputStreamClosedTwice:461->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testOutputStreamClosedTwice:473 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testOverWriteAndRead:509->FileSystemContractBaseTest.writeAndRead:797 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testFilesystemIsCaseSensitive:535 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testZeroByteFilesAreFiles:562 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testMultiByteFilesAreFiles:575 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameChildDirForbidden:614->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameDirToSelf:647->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testRenameFileToSelf:679->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testMoveFileUnderParent:693->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testLSRootDir:703->FileSystemContractBaseTest.createFile:484 » IO
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testListStatusRootDir:710->FileSystemContractBaseTest.createFile:484 » IO

Tests run: 2145, Failures: 0, Errors: 26, Skipped: 50

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:44:20.427s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [3.907s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:44:25.192s
[INFO] Finished at: Thu Oct 17 13:19:07 UTC 2013
[INFO] Final Memory: 31M/322M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating HDFS-5283
Updating MAPREDUCE-5586
Updating HDFS-5360
Updating MAPREDUCE-5585
Updating HADOOP-9897
Updating HADOOP-10005
Updating HDFS-5370
Updating HADOOP-9078
Updating HDFS-4376
Updating HDFS-5334
Updating HDFS-5346

Build failed in Jenkins: Hadoop-Hdfs-trunk #1554

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1554/changes>

Changes:

[jing9] HDFS-5130. Add test for snapshot related FsShell and DFSAdmin commands. Contributed by Binglin Chang.

[brandonli] HDFS-5330. fix readdir and readdirplus for large directories. Contributed by Brandon Li

[sandy] YARN-1295. In UnixLocalWrapperScriptBuilder, using bash -c can cause Text file busy errors. (Sandy Ryza)

[wang] HADOOP-10046. Print a log message when SSL is enabled (David S. Wang via wang)

[szetszwo] HDFS-5338. Add a conf to disable hostname check in datanode registration.

------------------------------------------
[...truncated 12206 lines...]
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 149.38 sec - in org.apache.hadoop.hdfs.TestDFSClientRetries
Running org.apache.hadoop.hdfs.TestListPathServlet
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.413 sec - in org.apache.hadoop.hdfs.TestListPathServlet
Running org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.164 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Running org.apache.hadoop.hdfs.TestDFSStorageStateRecovery
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 359.835 sec - in org.apache.hadoop.hdfs.TestDFSStorageStateRecovery
Running org.apache.hadoop.hdfs.TestFileCreationEmpty
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.248 sec - in org.apache.hadoop.hdfs.TestFileCreationEmpty
Running org.apache.hadoop.hdfs.TestSetrepIncreasing
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.015 sec - in org.apache.hadoop.hdfs.TestSetrepIncreasing
Running org.apache.hadoop.hdfs.TestEncryptedTransfer
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 80.244 sec - in org.apache.hadoop.hdfs.TestEncryptedTransfer
Running org.apache.hadoop.hdfs.TestDFSUpgrade
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.226 sec - in org.apache.hadoop.hdfs.TestDFSUpgrade
Running org.apache.hadoop.hdfs.TestCrcCorruption
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.256 sec - in org.apache.hadoop.hdfs.TestCrcCorruption
Running org.apache.hadoop.hdfs.TestHFlush
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.435 sec - in org.apache.hadoop.hdfs.TestHFlush
Running org.apache.hadoop.hdfs.TestFileAppendRestart
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.814 sec - in org.apache.hadoop.hdfs.TestFileAppendRestart
Running org.apache.hadoop.hdfs.TestDatanodeReport
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.415 sec - in org.apache.hadoop.hdfs.TestDatanodeReport
Running org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Tests run: 10, Failures: 0, Errors: 0, Skipped: 10, Time elapsed: 0.195 sec - in org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Running org.apache.hadoop.hdfs.TestFileInputStreamCache
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.215 sec - in org.apache.hadoop.hdfs.TestFileInputStreamCache
Running org.apache.hadoop.hdfs.TestRestartDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.273 sec - in org.apache.hadoop.hdfs.TestRestartDFS
Running org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.217 sec - in org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Running org.apache.hadoop.hdfs.TestDFSRemove
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.821 sec - in org.apache.hadoop.hdfs.TestDFSRemove
Running org.apache.hadoop.hdfs.TestHDFSTrash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.068 sec - in org.apache.hadoop.hdfs.TestHDFSTrash
Running org.apache.hadoop.hdfs.TestClientReportBadBlock
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 64.881 sec - in org.apache.hadoop.hdfs.TestClientReportBadBlock
Running org.apache.hadoop.hdfs.TestQuota
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.905 sec - in org.apache.hadoop.hdfs.TestQuota
Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.824 sec - in org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Running org.apache.hadoop.hdfs.TestDatanodeRegistration
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.008 sec - in org.apache.hadoop.hdfs.TestDatanodeRegistration
Running org.apache.hadoop.hdfs.TestAbandonBlock
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.245 sec - in org.apache.hadoop.hdfs.TestAbandonBlock
Running org.apache.hadoop.hdfs.TestDFSShell
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.817 sec - in org.apache.hadoop.hdfs.TestDFSShell
Running org.apache.hadoop.hdfs.TestListFilesInDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.636 sec - in org.apache.hadoop.hdfs.TestListFilesInDFS
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.164 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Running org.apache.hadoop.hdfs.TestPeerCache
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.315 sec - in org.apache.hadoop.hdfs.TestPeerCache
Running org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 8.545 sec - in org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Running org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.009 sec - in org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Running org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 167.467 sec - in org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Running org.apache.hadoop.hdfs.TestMultiThreadedHflush
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.264 sec - in org.apache.hadoop.hdfs.TestMultiThreadedHflush
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.746 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Running org.apache.hadoop.hdfs.TestMiniDFSCluster
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.619 sec - in org.apache.hadoop.hdfs.TestMiniDFSCluster
Running org.apache.hadoop.hdfs.TestLease
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.701 sec - in org.apache.hadoop.hdfs.TestLease
Running org.apache.hadoop.hdfs.TestListFilesInFileContext
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.378 sec - in org.apache.hadoop.hdfs.TestListFilesInFileContext
Running org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.351 sec - in org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Running org.apache.hadoop.hdfs.TestDFSClientFailover
Tests run: 6, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 10.432 sec - in org.apache.hadoop.hdfs.TestDFSClientFailover
Running org.apache.hadoop.hdfs.TestFileAppend2
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.819 sec - in org.apache.hadoop.hdfs.TestFileAppend2
Running org.apache.hadoop.hdfs.TestLocalDFS
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.239 sec - in org.apache.hadoop.hdfs.TestLocalDFS
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.165 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.hdfs.TestSeekBug
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.375 sec - in org.apache.hadoop.hdfs.TestSeekBug
Running org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.092 sec - in org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Running org.apache.hadoop.hdfs.util.TestChunkedArrayList
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.386 sec - in org.apache.hadoop.hdfs.util.TestChunkedArrayList
Running org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.198 sec - in org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Running org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.177 sec - in org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Running org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec - in org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Running org.apache.hadoop.hdfs.util.TestDiff
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.161 sec - in org.apache.hadoop.hdfs.util.TestDiff
Running org.apache.hadoop.hdfs.util.TestMD5FileUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.224 sec - in org.apache.hadoop.hdfs.util.TestMD5FileUtils
Running org.apache.hadoop.hdfs.util.TestDirectBufferPool
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.098 sec - in org.apache.hadoop.hdfs.util.TestDirectBufferPool
Running org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.159 sec - in org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Running org.apache.hadoop.hdfs.util.TestXMLUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestXMLUtils
Running org.apache.hadoop.hdfs.util.TestCyclicIteration
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec - in org.apache.hadoop.hdfs.util.TestCyclicIteration
Running org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.164 sec - in org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Running org.apache.hadoop.hdfs.TestSetTimes
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.429 sec - in org.apache.hadoop.hdfs.TestSetTimes
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Running org.apache.hadoop.hdfs.TestBlockReaderLocal
Tests run: 11, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 12.62 sec - in org.apache.hadoop.hdfs.TestBlockReaderLocal
Running org.apache.hadoop.hdfs.TestHftpURLTimeouts
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.975 sec - in org.apache.hadoop.hdfs.TestHftpURLTimeouts
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 48.965 sec - in org.apache.hadoop.cli.TestHDFSCLI
Running org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.132 sec - in org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Running org.apache.hadoop.fs.TestGlobPaths
Tests run: 31, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 4.478 sec - in org.apache.hadoop.fs.TestGlobPaths
Running org.apache.hadoop.fs.TestResolveHdfsSymlink
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.396 sec - in org.apache.hadoop.fs.TestResolveHdfsSymlink
Running org.apache.hadoop.fs.TestFcHdfsSetUMask
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.473 sec - in org.apache.hadoop.fs.TestFcHdfsSetUMask
Running org.apache.hadoop.fs.TestFcHdfsPermission
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.432 sec - in org.apache.hadoop.fs.TestFcHdfsPermission
Running org.apache.hadoop.fs.TestUrlStreamHandler
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.599 sec - in org.apache.hadoop.fs.TestUrlStreamHandler
Running org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.99 sec - in org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.004 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.88 sec - in org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.374 sec - in org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.192 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.238 sec - in org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Running org.apache.hadoop.fs.permission.TestStickyBit
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.871 sec - in org.apache.hadoop.fs.permission.TestStickyBit
Running org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.302 sec - in org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Running org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Tests run: 69, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.087 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Running org.apache.hadoop.fs.shell.TestHdfsTextCommand
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.571 sec - in org.apache.hadoop.fs.shell.TestHdfsTextCommand
Running org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.457 sec - in org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Running org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.651 sec - in org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Running org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Tests run: 7, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 0.208 sec - in org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Running org.apache.hadoop.fs.TestSymlinkHdfsDisable
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.992 sec - in org.apache.hadoop.fs.TestSymlinkHdfsDisable
Running org.apache.hadoop.fs.TestVolumeId
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.067 sec - in org.apache.hadoop.fs.TestVolumeId
Running org.apache.hadoop.fs.TestSymlinkHdfsFileSystem
Tests run: 72, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 8.388 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileSystem

Results :

Tests in error: 
  TestBalancerWithNodeGroup.testBalancerWithNodeGroup:302->runBalancer:173 »  te...
  TestWebHdfsFileSystemContract.testResponseCode:465 » IO All datanodes 127.0.0....
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock:244->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » SocketTimeout

Tests run: 2134, Failures: 0, Errors: 3, Skipped: 50

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:45:46.347s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [2.030s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:45:49.299s
[INFO] Finished at: Wed Oct 16 13:20:35 UTC 2013
[INFO] Final Memory: 40M/372M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Error updating JIRA issues. Saving issues for next build.
com.atlassian.jira.rpc.exception.RemotePermissionException: This issue does not exist or you don't have permission to view it.

Hadoop-Hdfs-trunk - Build # 1554 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1554/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 12399 lines...]
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:45:46.347s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [2.030s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:45:49.299s
[INFO] Finished at: Wed Oct 16 13:20:35 UTC 2013
[INFO] Final Memory: 40M/372M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Error updating JIRA issues. Saving issues for next build.
com.atlassian.jira.rpc.exception.RemotePermissionException: This issue does not exist or you don't have permission to view it.
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
3 tests failed.
REGRESSION:  org.apache.hadoop.hdfs.server.balancer.TestBalancerWithNodeGroup.testBalancerWithNodeGroup

Error Message:
test timed out after 60000 milliseconds

Stack Trace:
java.lang.Exception: test timed out after 60000 milliseconds
	at java.lang.Thread.sleep(Native Method)
	at org.apache.hadoop.hdfs.server.balancer.Balancer.waitForMoveCompletion(Balancer.java:1153)
	at org.apache.hadoop.hdfs.server.balancer.Balancer.dispatchBlockMoves(Balancer.java:1126)
	at org.apache.hadoop.hdfs.server.balancer.Balancer.run(Balancer.java:1400)
	at org.apache.hadoop.hdfs.server.balancer.Balancer.run(Balancer.java:1451)
	at org.apache.hadoop.hdfs.server.balancer.TestBalancerWithNodeGroup.runBalancer(TestBalancerWithNodeGroup.java:173)
	at org.apache.hadoop.hdfs.server.balancer.TestBalancerWithNodeGroup.testBalancerWithNodeGroup(TestBalancerWithNodeGroup.java:302)
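
The "test timed out after 60000 milliseconds" message is JUnit's per-method
timeout firing (typically an @Test(timeout=60000) annotation on the test), so
the balancer simply did not finish dispatching block moves within a minute on
this slave. To check whether that reproduces outside Jenkins, the class can be
rerun on its own with Surefire's test filter; a minimal sketch, assuming a
trunk checkout with default profiles:

    cd hadoop-hdfs-project/hadoop-hdfs
    mvn test -Dtest=TestBalancerWithNodeGroup

-Dtest= limits Surefire to the named class, so the rerun takes minutes rather
than the roughly 1h45m of the full hadoop-hdfs suite.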


REGRESSION:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testWriteReadAndDeleteOneBlock

Error Message:
Read timed out

Stack Trace:
java.net.SocketTimeoutException: Read timed out
	at java.net.SocketInputStream.socketRead0(Native Method)
	at java.net.SocketInputStream.read(SocketInputStream.java:129)
	at java.io.BufferedInputStream.fill(BufferedInputStream.java:218)
	at java.io.BufferedInputStream.read1(BufferedInputStream.java:258)
	at java.io.BufferedInputStream.read(BufferedInputStream.java:317)
	at sun.net.www.http.HttpClient.parseHTTPHeader(HttpClient.java:687)
	at sun.net.www.http.HttpClient.parseHTTP(HttpClient.java:632)
	at sun.net.www.protocol.http.HttpURLConnection.getInputStream(HttpURLConnection.java:1195)
	at java.net.HttpURLConnection.getResponseCode(HttpURLConnection.java:379)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:331)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeAndRead(FileSystemContractBaseTest.java:797)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.writeReadAndDelete(FileSystemContractBaseTest.java:263)
	at org.apache.hadoop.fs.FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock(FileSystemContractBaseTest.java:244)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
	at java.lang.reflect.Method.invoke(Method.java:597)
	at junit.framework.TestCase.runTest(TestCase.java:168)
	at junit.framework.TestCase.runBare(TestCase.java:134)
	at junit.framework.TestResult$1.protect(TestResult.java:110)
	at junit.framework.TestResult.runProtected(TestResult.java:128)
	at junit.framework.TestResult.run(TestResult.java:113)
	at junit.framework.TestCase.run(TestCase.java:124)
	at junit.framework.TestSuite.runTest(TestSuite.java:243)
	at junit.framework.TestSuite.run(TestSuite.java:238)
	at org.junit.internal.runners.JUnit38ClassRunner.run(JUnit38ClassRunner.java:83)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:264)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:153)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:124)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:200)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:153)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)


FAILED:  org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testResponseCode

Error Message:
All datanodes 127.0.0.1:53114 are bad. Aborting...

Stack Trace:
java.io.IOException: All datanodes 127.0.0.1:53114 are bad. Aborting...
	at org.apache.hadoop.hdfs.web.JsonUtil.toRemoteException(JsonUtil.java:157)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.validateResponse(WebHdfsFileSystem.java:350)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem.access$900(WebHdfsFileSystem.java:112)
	at org.apache.hadoop.hdfs.web.WebHdfsFileSystem$2.close(WebHdfsFileSystem.java:849)
	at org.apache.hadoop.hdfs.AppendTestUtil.testAppend(AppendTestUtil.java:198)
	at org.apache.hadoop.hdfs.web.TestWebHdfsFileSystemContract.testResponseCode(TestWebHdfsFileSystemContract.java:465)
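
Per-test stdout and stderr for these WebHDFS runs is captured under the
surefire-reports directory that the [ERROR] lines above point to. A rough
sketch of where to look on the slave or in the archived artifacts (file names
assume Surefire's default per-class naming):

    cd hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports
    ls *TestWebHdfsFileSystemContract*

Each failing class normally leaves a <class>.txt summary there, plus a
<class>-output.txt with the full console log when output redirection is
enabled in the Surefire configuration.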



Build failed in Jenkins: Hadoop-Hdfs-trunk #1553

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1553/changes>

Changes:

[devaraj] MAPREDUCE-5518. Fixed typo "can't read paritions file". Contributed by Albert Chu.

[jing9] HDFS-5352. Server#initLog() doesn't close InputStream in httpfs. Contributed by Ted Yu.

[cnauroth] HADOOP-10040. Correct native.vcxproj for inclusion of lz4hc.c. Contributed by Chris Nauroth.

[sandy] YARN-1182. MiniYARNCluster creates and inits the RM/NM only on start() (Karthik Kambatla via Sandy Ryza)

[sandy] YARN-1259. In Fair Scheduler web UI, queue num pending and num active apps switched. (Robert Kanter via Sandy Ryza)

[jing9] HDFS-5342. Provide more information in the FSNamesystem JMX interfaces. Contributed by Haohui Mai.

[cnauroth] MAPREDUCE-5546. mapred.cmd on Windows set HADOOP_OPTS incorrectly. Contributed by Chuan Liu.

[jeagles] HADOOP-9494. Excluded auto-generated and examples code from clover reports (Andrey Klochkov via jeagles)

[cnauroth] HADOOP-10040. Correct merge error on native.vcxproj. Contributed by Chris Nauroth.

[cnauroth] HADOOP-10040. svn propset to native line endings on Windows files. Contributed by Chris Nauroth.

[cnauroth] HADOOP-10040. Revert svn propset for CRLF line endings on Windows files. Contributed by Chris Nauroth.

------------------------------------------
[...truncated 11406 lines...]
Running org.apache.hadoop.hdfs.TestDFSStorageStateRecovery
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 355.786 sec - in org.apache.hadoop.hdfs.TestDFSStorageStateRecovery
Running org.apache.hadoop.hdfs.TestFileCreationEmpty
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.308 sec - in org.apache.hadoop.hdfs.TestFileCreationEmpty
Running org.apache.hadoop.hdfs.TestSetrepIncreasing
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.451 sec - in org.apache.hadoop.hdfs.TestSetrepIncreasing
Running org.apache.hadoop.hdfs.TestEncryptedTransfer
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 86.441 sec - in org.apache.hadoop.hdfs.TestEncryptedTransfer
Running org.apache.hadoop.hdfs.TestDFSUpgrade
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.338 sec - in org.apache.hadoop.hdfs.TestDFSUpgrade
Running org.apache.hadoop.hdfs.TestCrcCorruption
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.349 sec - in org.apache.hadoop.hdfs.TestCrcCorruption
Running org.apache.hadoop.hdfs.TestHFlush
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.423 sec - in org.apache.hadoop.hdfs.TestHFlush
Running org.apache.hadoop.hdfs.TestFileAppendRestart
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.083 sec - in org.apache.hadoop.hdfs.TestFileAppendRestart
Running org.apache.hadoop.hdfs.TestDatanodeReport
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.564 sec - in org.apache.hadoop.hdfs.TestDatanodeReport
Running org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Tests run: 10, Failures: 0, Errors: 0, Skipped: 10, Time elapsed: 0.197 sec - in org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Running org.apache.hadoop.hdfs.TestFileInputStreamCache
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.205 sec - in org.apache.hadoop.hdfs.TestFileInputStreamCache
Running org.apache.hadoop.hdfs.TestRestartDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.239 sec - in org.apache.hadoop.hdfs.TestRestartDFS
Running org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.76 sec - in org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Running org.apache.hadoop.hdfs.TestDFSRemove
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.022 sec - in org.apache.hadoop.hdfs.TestDFSRemove
Running org.apache.hadoop.hdfs.TestHDFSTrash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.744 sec - in org.apache.hadoop.hdfs.TestHDFSTrash
Running org.apache.hadoop.hdfs.TestClientReportBadBlock
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 45.265 sec - in org.apache.hadoop.hdfs.TestClientReportBadBlock
Running org.apache.hadoop.hdfs.TestQuota
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.713 sec - in org.apache.hadoop.hdfs.TestQuota
Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.179 sec - in org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Running org.apache.hadoop.hdfs.TestDatanodeRegistration
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.028 sec - in org.apache.hadoop.hdfs.TestDatanodeRegistration
Running org.apache.hadoop.hdfs.TestAbandonBlock
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.04 sec - in org.apache.hadoop.hdfs.TestAbandonBlock
Running org.apache.hadoop.hdfs.TestDFSShell
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.124 sec - in org.apache.hadoop.hdfs.TestDFSShell
Running org.apache.hadoop.hdfs.TestListFilesInDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.277 sec - in org.apache.hadoop.hdfs.TestListFilesInDFS
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Running org.apache.hadoop.hdfs.TestPeerCache
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.343 sec - in org.apache.hadoop.hdfs.TestPeerCache
Running org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 8.421 sec - in org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Running org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.273 sec - in org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Running org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 103.799 sec - in org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Running org.apache.hadoop.hdfs.TestMultiThreadedHflush
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.287 sec - in org.apache.hadoop.hdfs.TestMultiThreadedHflush
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.913 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Running org.apache.hadoop.hdfs.TestMiniDFSCluster
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.154 sec - in org.apache.hadoop.hdfs.TestMiniDFSCluster
Running org.apache.hadoop.hdfs.TestLease
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.363 sec - in org.apache.hadoop.hdfs.TestLease
Running org.apache.hadoop.hdfs.TestListFilesInFileContext
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.387 sec - in org.apache.hadoop.hdfs.TestListFilesInFileContext
Running org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.314 sec - in org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Running org.apache.hadoop.hdfs.TestDFSClientFailover
Tests run: 6, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 10.387 sec - in org.apache.hadoop.hdfs.TestDFSClientFailover
Running org.apache.hadoop.hdfs.TestFileAppend2
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.416 sec - in org.apache.hadoop.hdfs.TestFileAppend2
Running org.apache.hadoop.hdfs.TestLocalDFS
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.397 sec - in org.apache.hadoop.hdfs.TestLocalDFS
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.846 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.hdfs.TestSeekBug
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.877 sec - in org.apache.hadoop.hdfs.TestSeekBug
Running org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.842 sec - in org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Running org.apache.hadoop.hdfs.util.TestChunkedArrayList
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.353 sec - in org.apache.hadoop.hdfs.util.TestChunkedArrayList
Running org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.198 sec - in org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Running org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.186 sec - in org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Running org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Running org.apache.hadoop.hdfs.util.TestDiff
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.558 sec - in org.apache.hadoop.hdfs.util.TestDiff
Running org.apache.hadoop.hdfs.util.TestMD5FileUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.234 sec - in org.apache.hadoop.hdfs.util.TestMD5FileUtils
Running org.apache.hadoop.hdfs.util.TestDirectBufferPool
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.093 sec - in org.apache.hadoop.hdfs.util.TestDirectBufferPool
Running org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.16 sec - in org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Running org.apache.hadoop.hdfs.util.TestXMLUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec - in org.apache.hadoop.hdfs.util.TestXMLUtils
Running org.apache.hadoop.hdfs.util.TestCyclicIteration
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec - in org.apache.hadoop.hdfs.util.TestCyclicIteration
Running org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.165 sec - in org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Running org.apache.hadoop.hdfs.TestSetTimes
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.29 sec - in org.apache.hadoop.hdfs.TestSetTimes
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Running org.apache.hadoop.hdfs.TestBlockReaderLocal
Tests run: 11, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 11.677 sec - in org.apache.hadoop.hdfs.TestBlockReaderLocal
Running org.apache.hadoop.hdfs.TestHftpURLTimeouts
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.007 sec - in org.apache.hadoop.hdfs.TestHftpURLTimeouts
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 49.311 sec - in org.apache.hadoop.cli.TestHDFSCLI
Running org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.131 sec - in org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Running org.apache.hadoop.fs.TestGlobPaths
Tests run: 31, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 4.524 sec - in org.apache.hadoop.fs.TestGlobPaths
Running org.apache.hadoop.fs.TestResolveHdfsSymlink
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.32 sec - in org.apache.hadoop.fs.TestResolveHdfsSymlink
Running org.apache.hadoop.fs.TestFcHdfsSetUMask
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.692 sec - in org.apache.hadoop.fs.TestFcHdfsSetUMask
Running org.apache.hadoop.fs.TestFcHdfsPermission
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.497 sec - in org.apache.hadoop.fs.TestFcHdfsPermission
Running org.apache.hadoop.fs.TestUrlStreamHandler
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.499 sec - in org.apache.hadoop.fs.TestUrlStreamHandler
Running org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.204 sec - in org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.976 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.966 sec - in org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.756 sec - in org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.984 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.148 sec - in org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Running org.apache.hadoop.fs.permission.TestStickyBit
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.075 sec - in org.apache.hadoop.fs.permission.TestStickyBit
Running org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.506 sec - in org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Running org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Tests run: 69, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.553 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Running org.apache.hadoop.fs.shell.TestHdfsTextCommand
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.516 sec - in org.apache.hadoop.fs.shell.TestHdfsTextCommand
Running org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.387 sec - in org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Running org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.579 sec - in org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Running org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Tests run: 7, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 0.21 sec - in org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Running org.apache.hadoop.fs.TestSymlinkHdfsDisable
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.001 sec - in org.apache.hadoop.fs.TestSymlinkHdfsDisable
Running org.apache.hadoop.fs.TestVolumeId
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.067 sec - in org.apache.hadoop.fs.TestVolumeId
Running org.apache.hadoop.fs.TestSymlinkHdfsFileSystem
Tests run: 72, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 8.189 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileSystem

Results :

Tests in error: 
  TestWebHdfsFileSystemContract.testResponseCode:465 » IO All datanodes 127.0.0....
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteHalfABlock:240->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » SocketTimeout

Tests run: 2128, Failures: 0, Errors: 2, Skipped: 50

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:44:23.362s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [4.022s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:44:28.230s
[INFO] Finished at: Tue Oct 15 13:19:16 UTC 2013
[INFO] Final Memory: 40M/373M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Recording test results
Updating HADOOP-10040
Updating HADOOP-9494
Updating HDFS-5352
Updating MAPREDUCE-5518
Updating YARN-1259
Updating HDFS-5342
Updating YARN-1182
Updating MAPREDUCE-5546

Re: Build failed in Jenkins: Hadoop-Hdfs-trunk #1552

Posted by Ted Yu <yu...@gmail.com>.
I noticed that test failures in 'Hadoop HDFS' caused tests in other
modules to be skipped:

[INFO] Apache Hadoop HDFS ................................ FAILURE [1:44:29.257s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED

Should we apply the '--fail-at-end' Maven option so that all tests are run?

See http://maven.apache.org/guides/mini/guide-multiple-modules.html
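
For reference, the flag goes on the same top-level invocation the job already
runs; a minimal sketch (the job's actual goals and profiles are not visible in
this log, so a plain 'clean test' is assumed here):

    mvn clean test --fail-at-end    # short form: -fae

With --fail-at-end Maven records hadoop-hdfs as FAILURE but keeps going
through the modules that do not depend on it, deferring the overall build
failure to the end of the reactor; modules that do depend on hadoop-hdfs are
still skipped.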

Cheers


On Mon, Oct 14, 2013 at 6:19 AM, Apache Jenkins Server <
jenkins@builds.apache.org> wrote:

> See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1552/changes>
>
> Changes:
>
> [sandy] MAPREDUCE-5463. Deprecate SLOTS_MILLIS counters. (Tzuyoshi Ozawa
> via Sandy Ryza)
>
> [sandy] YARN-305. Fair scheduler logs too many "Node offered to app"
> messages. (Lohit Vijayarenu via Sandy Ryza)
>
> [sseth] MAPREDUCE-5329. Allow MR applications to use additional
> AuxServices, which are compatible with the default MapReduce shuffle.
> Contributed by Avner BenHanoch.
>
> ------------------------------------------
> [...truncated 11425 lines...]
> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.954 sec
> - in org.apache.hadoop.hdfs.TestSetrepIncreasing
> Running org.apache.hadoop.hdfs.TestEncryptedTransfer
> Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 79.573
> sec - in org.apache.hadoop.hdfs.TestEncryptedTransfer
> Running org.apache.hadoop.hdfs.TestDFSUpgrade
> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.229 sec
> - in org.apache.hadoop.hdfs.TestDFSUpgrade
> Running org.apache.hadoop.hdfs.TestCrcCorruption
> Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.306 sec
> - in org.apache.hadoop.hdfs.TestCrcCorruption
> Running org.apache.hadoop.hdfs.TestHFlush
> Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.177 sec
> - in org.apache.hadoop.hdfs.TestHFlush
> Running org.apache.hadoop.hdfs.TestFileAppendRestart
> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.167 sec
> - in org.apache.hadoop.hdfs.TestFileAppendRestart
> Running org.apache.hadoop.hdfs.TestDatanodeReport
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.518 sec
> - in org.apache.hadoop.hdfs.TestDatanodeReport
> Running org.apache.hadoop.hdfs.TestShortCircuitLocalRead
> Tests run: 10, Failures: 0, Errors: 0, Skipped: 10, Time elapsed: 0.195
> sec - in org.apache.hadoop.hdfs.TestShortCircuitLocalRead
> Running org.apache.hadoop.hdfs.TestFileInputStreamCache
> Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.205 sec
> - in org.apache.hadoop.hdfs.TestFileInputStreamCache
> Running org.apache.hadoop.hdfs.TestRestartDFS
> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.3 sec -
> in org.apache.hadoop.hdfs.TestRestartDFS
> Running org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
> Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.577 sec
> - in org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
> Running org.apache.hadoop.hdfs.TestDFSRemove
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.897 sec
> - in org.apache.hadoop.hdfs.TestDFSRemove
> Running org.apache.hadoop.hdfs.TestHDFSTrash
> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.345 sec
> - in org.apache.hadoop.hdfs.TestHDFSTrash
> Running org.apache.hadoop.hdfs.TestClientReportBadBlock
> Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 63.52 sec
> - in org.apache.hadoop.hdfs.TestClientReportBadBlock
> Running org.apache.hadoop.hdfs.TestQuota
> Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.236 sec
> - in org.apache.hadoop.hdfs.TestQuota
> Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
> Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.421 sec
> - in org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
> Running org.apache.hadoop.hdfs.TestDatanodeRegistration
> Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.058 sec
> - in org.apache.hadoop.hdfs.TestDatanodeRegistration
> Running org.apache.hadoop.hdfs.TestAbandonBlock
> Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.214 sec
> - in org.apache.hadoop.hdfs.TestAbandonBlock
> Running org.apache.hadoop.hdfs.TestDFSShell

Build failed in Jenkins: Hadoop-Hdfs-trunk #1552

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1552/changes>

Changes:

[sandy] MAPREDUCE-5463. Deprecate SLOTS_MILLIS counters. (Tzuyoshi Ozawa via Sandy Ryza)

[sandy] YARN-305. Fair scheduler logs too many "Node offered to app" messages. (Lohit Vijayarenu via Sandy Ryza)

[sseth] MAPREDUCE-5329. Allow MR applications to use additional AuxServices, which are compatible with the default MapReduce shuffle. Contributed by Avner BenHanoch.

------------------------------------------
[...truncated 11425 lines...]
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.954 sec - in org.apache.hadoop.hdfs.TestSetrepIncreasing
Running org.apache.hadoop.hdfs.TestEncryptedTransfer
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 79.573 sec - in org.apache.hadoop.hdfs.TestEncryptedTransfer
Running org.apache.hadoop.hdfs.TestDFSUpgrade
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.229 sec - in org.apache.hadoop.hdfs.TestDFSUpgrade
Running org.apache.hadoop.hdfs.TestCrcCorruption
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.306 sec - in org.apache.hadoop.hdfs.TestCrcCorruption
Running org.apache.hadoop.hdfs.TestHFlush
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.177 sec - in org.apache.hadoop.hdfs.TestHFlush
Running org.apache.hadoop.hdfs.TestFileAppendRestart
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.167 sec - in org.apache.hadoop.hdfs.TestFileAppendRestart
Running org.apache.hadoop.hdfs.TestDatanodeReport
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.518 sec - in org.apache.hadoop.hdfs.TestDatanodeReport
Running org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Tests run: 10, Failures: 0, Errors: 0, Skipped: 10, Time elapsed: 0.195 sec - in org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Running org.apache.hadoop.hdfs.TestFileInputStreamCache
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.205 sec - in org.apache.hadoop.hdfs.TestFileInputStreamCache
Running org.apache.hadoop.hdfs.TestRestartDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.3 sec - in org.apache.hadoop.hdfs.TestRestartDFS
Running org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.577 sec - in org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Running org.apache.hadoop.hdfs.TestDFSRemove
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.897 sec - in org.apache.hadoop.hdfs.TestDFSRemove
Running org.apache.hadoop.hdfs.TestHDFSTrash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.345 sec - in org.apache.hadoop.hdfs.TestHDFSTrash
Running org.apache.hadoop.hdfs.TestClientReportBadBlock
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 63.52 sec - in org.apache.hadoop.hdfs.TestClientReportBadBlock
Running org.apache.hadoop.hdfs.TestQuota
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.236 sec - in org.apache.hadoop.hdfs.TestQuota
Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.421 sec - in org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Running org.apache.hadoop.hdfs.TestDatanodeRegistration
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.058 sec - in org.apache.hadoop.hdfs.TestDatanodeRegistration
Running org.apache.hadoop.hdfs.TestAbandonBlock
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.214 sec - in org.apache.hadoop.hdfs.TestAbandonBlock
Running org.apache.hadoop.hdfs.TestDFSShell
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.489 sec - in org.apache.hadoop.hdfs.TestDFSShell
Running org.apache.hadoop.hdfs.TestListFilesInDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.339 sec - in org.apache.hadoop.hdfs.TestListFilesInDFS
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Running org.apache.hadoop.hdfs.TestPeerCache
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.317 sec - in org.apache.hadoop.hdfs.TestPeerCache
Running org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 8.434 sec - in org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Running org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.008 sec - in org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Running org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 152.644 sec - in org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Running org.apache.hadoop.hdfs.TestMultiThreadedHflush
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.954 sec - in org.apache.hadoop.hdfs.TestMultiThreadedHflush
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.984 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Running org.apache.hadoop.hdfs.TestMiniDFSCluster
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.416 sec - in org.apache.hadoop.hdfs.TestMiniDFSCluster
Running org.apache.hadoop.hdfs.TestLease
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.635 sec - in org.apache.hadoop.hdfs.TestLease
Running org.apache.hadoop.hdfs.TestListFilesInFileContext
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.309 sec - in org.apache.hadoop.hdfs.TestListFilesInFileContext
Running org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.663 sec - in org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Running org.apache.hadoop.hdfs.TestDFSClientFailover
Tests run: 6, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 10.39 sec - in org.apache.hadoop.hdfs.TestDFSClientFailover
Running org.apache.hadoop.hdfs.TestFileAppend2
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.319 sec - in org.apache.hadoop.hdfs.TestFileAppend2
Running org.apache.hadoop.hdfs.TestLocalDFS
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.159 sec - in org.apache.hadoop.hdfs.TestLocalDFS
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.298 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.hdfs.TestSeekBug
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.443 sec - in org.apache.hadoop.hdfs.TestSeekBug
Running org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.984 sec - in org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Running org.apache.hadoop.hdfs.util.TestChunkedArrayList
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.39 sec - in org.apache.hadoop.hdfs.util.TestChunkedArrayList
Running org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.199 sec - in org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Running org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.18 sec - in org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Running org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Running org.apache.hadoop.hdfs.util.TestDiff
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.333 sec - in org.apache.hadoop.hdfs.util.TestDiff
Running org.apache.hadoop.hdfs.util.TestMD5FileUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.226 sec - in org.apache.hadoop.hdfs.util.TestMD5FileUtils
Running org.apache.hadoop.hdfs.util.TestDirectBufferPool
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.103 sec - in org.apache.hadoop.hdfs.util.TestDirectBufferPool
Running org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.158 sec - in org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Running org.apache.hadoop.hdfs.util.TestXMLUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestXMLUtils
Running org.apache.hadoop.hdfs.util.TestCyclicIteration
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestCyclicIteration
Running org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.164 sec - in org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Running org.apache.hadoop.hdfs.TestSetTimes
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.415 sec - in org.apache.hadoop.hdfs.TestSetTimes
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.162 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Running org.apache.hadoop.hdfs.TestBlockReaderLocal
Tests run: 11, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 11.946 sec - in org.apache.hadoop.hdfs.TestBlockReaderLocal
Running org.apache.hadoop.hdfs.TestHftpURLTimeouts
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.973 sec - in org.apache.hadoop.hdfs.TestHftpURLTimeouts
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 48.825 sec - in org.apache.hadoop.cli.TestHDFSCLI
Running org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.131 sec - in org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Running org.apache.hadoop.fs.TestGlobPaths
Tests run: 31, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 4.375 sec - in org.apache.hadoop.fs.TestGlobPaths
Running org.apache.hadoop.fs.TestResolveHdfsSymlink
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.4 sec - in org.apache.hadoop.fs.TestResolveHdfsSymlink
Running org.apache.hadoop.fs.TestFcHdfsSetUMask
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.497 sec - in org.apache.hadoop.fs.TestFcHdfsSetUMask
Running org.apache.hadoop.fs.TestFcHdfsPermission
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.483 sec - in org.apache.hadoop.fs.TestFcHdfsPermission
Running org.apache.hadoop.fs.TestUrlStreamHandler
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.508 sec - in org.apache.hadoop.fs.TestUrlStreamHandler
Running org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.95 sec - in org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.601 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.992 sec - in org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.34 sec - in org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.368 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.048 sec - in org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Running org.apache.hadoop.fs.permission.TestStickyBit
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.024 sec - in org.apache.hadoop.fs.permission.TestStickyBit
Running org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.213 sec - in org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Running org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Tests run: 69, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.883 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Running org.apache.hadoop.fs.shell.TestHdfsTextCommand
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.573 sec - in org.apache.hadoop.fs.shell.TestHdfsTextCommand
Running org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.414 sec - in org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Running org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.514 sec - in org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Running org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Tests run: 7, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 0.211 sec - in org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Running org.apache.hadoop.fs.TestSymlinkHdfsDisable
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.043 sec - in org.apache.hadoop.fs.TestSymlinkHdfsDisable
Running org.apache.hadoop.fs.TestVolumeId
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.067 sec - in org.apache.hadoop.fs.TestVolumeId
Running org.apache.hadoop.fs.TestSymlinkHdfsFileSystem
Tests run: 72, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 7.973 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileSystem

Results :

Failed tests: 
  TestQuorumJournalManager.shutdown:108 Leaked thread: "IPC Client (4059123) connection to /127.0.0.1:57570 from jenkins" Id=87 RUNNABLE
	at org.apache.hadoop.io.IOUtils.closeStream(IOUtils.java:256)
	at org.apache.hadoop.ipc.Client$Connection.close(Client.java:1137)
	at org.apache.hadoop.ipc.Client$Connection.run(Client.java:954)


org.apache.hadoop.io.IOUtils.closeStream(IOUtils.java:256)
org.apache.hadoop.ipc.Client$Connection.close(Client.java:1137)
org.apache.hadoop.ipc.Client$Connection.run(Client.java:954)

Tests in error: 
  TestWebHdfsFileSystemContract.testResponseCode:465 » IO All datanodes 127.0.0....
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteOneBlock:244->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » SocketTimeout

Tests run: 2128, Failures: 1, Errors: 2, Skipped: 50

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:44:29.257s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [1.800s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:44:31.893s
[INFO] Finished at: Mon Oct 14 13:19:14 UTC 2013
[INFO] Final Memory: 34M/402M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Updating MAPREDUCE-5329
Updating MAPREDUCE-5463
Updating YARN-305
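
The TestQuorumJournalManager failure above is a leaked-thread assertion: the test's shutdown check found an "IPC Client ... connection" thread still RUNNABLE after teardown. As a rough illustration only (plain JDK thread enumeration, not Hadoop's actual test utility; the "IPC Client" pattern is copied from the report above), such a check can be sketched as:

    import java.util.Map;
    import java.util.regex.Pattern;

    /** Illustrative leaked-thread check, analogous to what the shutdown step reports. */
    public class LeakedThreadCheck {
      /** Fails if any live thread name matches the given pattern. */
      static void assertNoThreadsMatching(String regex) {
        Pattern p = Pattern.compile(regex);
        for (Map.Entry<Thread, StackTraceElement[]> e : Thread.getAllStackTraces().entrySet()) {
          Thread t = e.getKey();
          if (t.isAlive() && p.matcher(t.getName()).find()) {
            throw new AssertionError("Leaked thread: \"" + t.getName()
                + "\" Id=" + t.getId() + " " + t.getState());
          }
        }
      }

      public static void main(String[] args) {
        // The failure above matched an "IPC Client (...) connection to ..." thread.
        assertNoThreadsMatching("IPC Client");
      }
    }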

Build failed in Jenkins: Hadoop-Hdfs-trunk #1551

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1551/changes>

Changes:

[brandonli] HDFS-5329. Update FSNamesystem#getListing() to handle inode path in startAfter token. Contributed by Brandon Li

------------------------------------------
[...truncated 11398 lines...]
Running org.apache.hadoop.hdfs.TestConnCache
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.196 sec - in org.apache.hadoop.hdfs.TestConnCache
Running org.apache.hadoop.hdfs.TestDFSClientRetries
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 149.921 sec - in org.apache.hadoop.hdfs.TestDFSClientRetries
Running org.apache.hadoop.hdfs.TestListPathServlet
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.387 sec - in org.apache.hadoop.hdfs.TestListPathServlet
Running org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Running org.apache.hadoop.hdfs.TestDFSStorageStateRecovery
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 352.899 sec - in org.apache.hadoop.hdfs.TestDFSStorageStateRecovery
Running org.apache.hadoop.hdfs.TestFileCreationEmpty
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.21 sec - in org.apache.hadoop.hdfs.TestFileCreationEmpty
Running org.apache.hadoop.hdfs.TestSetrepIncreasing
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.876 sec - in org.apache.hadoop.hdfs.TestSetrepIncreasing
Running org.apache.hadoop.hdfs.TestEncryptedTransfer
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 80.64 sec - in org.apache.hadoop.hdfs.TestEncryptedTransfer
Running org.apache.hadoop.hdfs.TestDFSUpgrade
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 33.983 sec - in org.apache.hadoop.hdfs.TestDFSUpgrade
Running org.apache.hadoop.hdfs.TestCrcCorruption
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.476 sec - in org.apache.hadoop.hdfs.TestCrcCorruption
Running org.apache.hadoop.hdfs.TestHFlush
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.04 sec - in org.apache.hadoop.hdfs.TestHFlush
Running org.apache.hadoop.hdfs.TestFileAppendRestart
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.21 sec - in org.apache.hadoop.hdfs.TestFileAppendRestart
Running org.apache.hadoop.hdfs.TestDatanodeReport
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.445 sec - in org.apache.hadoop.hdfs.TestDatanodeReport
Running org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Tests run: 10, Failures: 0, Errors: 0, Skipped: 10, Time elapsed: 0.24 sec - in org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Running org.apache.hadoop.hdfs.TestFileInputStreamCache
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.205 sec - in org.apache.hadoop.hdfs.TestFileInputStreamCache
Running org.apache.hadoop.hdfs.TestRestartDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.62 sec - in org.apache.hadoop.hdfs.TestRestartDFS
Running org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.48 sec - in org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Running org.apache.hadoop.hdfs.TestDFSRemove
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.946 sec - in org.apache.hadoop.hdfs.TestDFSRemove
Running org.apache.hadoop.hdfs.TestHDFSTrash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.995 sec - in org.apache.hadoop.hdfs.TestHDFSTrash
Running org.apache.hadoop.hdfs.TestClientReportBadBlock
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 69.807 sec - in org.apache.hadoop.hdfs.TestClientReportBadBlock
Running org.apache.hadoop.hdfs.TestQuota
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.835 sec - in org.apache.hadoop.hdfs.TestQuota
Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.775 sec - in org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Running org.apache.hadoop.hdfs.TestDatanodeRegistration
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.007 sec - in org.apache.hadoop.hdfs.TestDatanodeRegistration
Running org.apache.hadoop.hdfs.TestAbandonBlock
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.722 sec - in org.apache.hadoop.hdfs.TestAbandonBlock
Running org.apache.hadoop.hdfs.TestDFSShell
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.906 sec - in org.apache.hadoop.hdfs.TestDFSShell
Running org.apache.hadoop.hdfs.TestListFilesInDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.35 sec - in org.apache.hadoop.hdfs.TestListFilesInDFS
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.164 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Running org.apache.hadoop.hdfs.TestPeerCache
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.316 sec - in org.apache.hadoop.hdfs.TestPeerCache
Running org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 8.335 sec - in org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Running org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.997 sec - in org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Running org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 99.623 sec - in org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Running org.apache.hadoop.hdfs.TestMultiThreadedHflush
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.764 sec - in org.apache.hadoop.hdfs.TestMultiThreadedHflush
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.868 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Running org.apache.hadoop.hdfs.TestMiniDFSCluster
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.18 sec - in org.apache.hadoop.hdfs.TestMiniDFSCluster
Running org.apache.hadoop.hdfs.TestLease
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.146 sec - in org.apache.hadoop.hdfs.TestLease
Running org.apache.hadoop.hdfs.TestListFilesInFileContext
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.44 sec - in org.apache.hadoop.hdfs.TestListFilesInFileContext
Running org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.322 sec - in org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Running org.apache.hadoop.hdfs.TestDFSClientFailover
Tests run: 6, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 10.29 sec - in org.apache.hadoop.hdfs.TestDFSClientFailover
Running org.apache.hadoop.hdfs.TestFileAppend2
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.271 sec - in org.apache.hadoop.hdfs.TestFileAppend2
Running org.apache.hadoop.hdfs.TestLocalDFS
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.745 sec - in org.apache.hadoop.hdfs.TestLocalDFS
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.245 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.hdfs.TestSeekBug
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.556 sec - in org.apache.hadoop.hdfs.TestSeekBug
Running org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.865 sec - in org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Running org.apache.hadoop.hdfs.util.TestChunkedArrayList
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.361 sec - in org.apache.hadoop.hdfs.util.TestChunkedArrayList
Running org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.198 sec - in org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Running org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.197 sec - in org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Running org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Running org.apache.hadoop.hdfs.util.TestDiff
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.138 sec - in org.apache.hadoop.hdfs.util.TestDiff
Running org.apache.hadoop.hdfs.util.TestMD5FileUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.251 sec - in org.apache.hadoop.hdfs.util.TestMD5FileUtils
Running org.apache.hadoop.hdfs.util.TestDirectBufferPool
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.09 sec - in org.apache.hadoop.hdfs.util.TestDirectBufferPool
Running org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.158 sec - in org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Running org.apache.hadoop.hdfs.util.TestXMLUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestXMLUtils
Running org.apache.hadoop.hdfs.util.TestCyclicIteration
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestCyclicIteration
Running org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.165 sec - in org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Running org.apache.hadoop.hdfs.TestSetTimes
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.597 sec - in org.apache.hadoop.hdfs.TestSetTimes
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Running org.apache.hadoop.hdfs.TestBlockReaderLocal
Tests run: 11, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 11.614 sec - in org.apache.hadoop.hdfs.TestBlockReaderLocal
Running org.apache.hadoop.hdfs.TestHftpURLTimeouts
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.002 sec - in org.apache.hadoop.hdfs.TestHftpURLTimeouts
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.201 sec - in org.apache.hadoop.cli.TestHDFSCLI
Running org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.132 sec - in org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Running org.apache.hadoop.fs.TestGlobPaths
Tests run: 31, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 4.54 sec - in org.apache.hadoop.fs.TestGlobPaths
Running org.apache.hadoop.fs.TestResolveHdfsSymlink
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.293 sec - in org.apache.hadoop.fs.TestResolveHdfsSymlink
Running org.apache.hadoop.fs.TestFcHdfsSetUMask
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.553 sec - in org.apache.hadoop.fs.TestFcHdfsSetUMask
Running org.apache.hadoop.fs.TestFcHdfsPermission
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.406 sec - in org.apache.hadoop.fs.TestFcHdfsPermission
Running org.apache.hadoop.fs.TestUrlStreamHandler
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.529 sec - in org.apache.hadoop.fs.TestUrlStreamHandler
Running org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.977 sec - in org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.585 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.939 sec - in org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.338 sec - in org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.178 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.97 sec - in org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Running org.apache.hadoop.fs.permission.TestStickyBit
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.449 sec - in org.apache.hadoop.fs.permission.TestStickyBit
Running org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.838 sec - in org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Running org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Tests run: 69, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.926 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Running org.apache.hadoop.fs.shell.TestHdfsTextCommand
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.473 sec - in org.apache.hadoop.fs.shell.TestHdfsTextCommand
Running org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.351 sec - in org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Running org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.658 sec - in org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Running org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Tests run: 7, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 0.208 sec - in org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Running org.apache.hadoop.fs.TestSymlinkHdfsDisable
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.046 sec - in org.apache.hadoop.fs.TestSymlinkHdfsDisable
Running org.apache.hadoop.fs.TestVolumeId
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.066 sec - in org.apache.hadoop.fs.TestVolumeId
Running org.apache.hadoop.fs.TestSymlinkHdfsFileSystem
Tests run: 72, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 7.993 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileSystem

Results :

Tests in error: 
  TestWebHdfsFileSystemContract.testResponseCode:465 » IO All datanodes 127.0.0....
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteHalfABlock:240->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » SocketTimeout

Tests run: 2128, Failures: 0, Errors: 2, Skipped: 50

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:45:15.686s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [2.127s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:45:18.659s
[INFO] Finished at: Sun Oct 13 13:19:57 UTC 2013
[INFO] Final Memory: 39M/344M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Updating HDFS-5329
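
The two TestWebHdfsFileSystemContract errors above come from the contract test's write-then-read round trip timing out over WebHDFS ("All datanodes ..." / SocketTimeout). A minimal sketch of such a round trip against the public FileSystem API, assuming a reachable cluster (the webhdfs URI, path, and buffer size below are placeholders, not the test's actual values):

    import java.net.URI;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    /** Illustrative WebHDFS write/read round trip; values are placeholders for the sketch. */
    public class WebHdfsRoundTrip {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Placeholder webhdfs URI; the contract test itself runs against a test cluster.
        FileSystem fs = FileSystem.get(URI.create("webhdfs://localhost:50070"), conf);
        Path file = new Path("/tmp/contract-sketch.dat");
        byte[] data = new byte[64 * 1024];
        try (FSDataOutputStream out = fs.create(file, true)) {
          out.write(data);              // a slow or unreachable datanode surfaces here as a timeout
        }
        byte[] back = new byte[data.length];
        try (FSDataInputStream in = fs.open(file)) {
          in.readFully(back);           // read the same bytes back
        }
        fs.delete(file, false);
      }
    }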

Build failed in Jenkins: Hadoop-Hdfs-trunk #1550

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1550/changes>

Changes:

[llu] Workaround git eol hell

[llu] Fix typo (HDFS-5276 -> HDFS-5267) in CHANGES.txt

[cnauroth] HADOOP-10040. hadoop.cmd in UNIX format and would not run by default on Windows. Contributed by Chris Nauroth.

[sandy] YARN-1044. used/min/max resources do not display info in the scheduler page (Sangjin Lee via Sandy Ryza)

[jing9] HDFS-5322. HDFS delegation token not found in cache errors seen on secure HA clusters. Contributed by Jing Zhao.

[sandy] Delete fair-scheduler-allocation.xml for YARN-1300

[sandy] YARN-1300. SLS tests fail because conf puts YARN properties in fair-scheduler.xml (Ted Yu via Sandy Ryza)

------------------------------------------
[...truncated 11401 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.31 sec - in org.apache.hadoop.hdfs.TestListPathServlet
Running org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Running org.apache.hadoop.hdfs.TestDFSStorageStateRecovery
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 374.029 sec - in org.apache.hadoop.hdfs.TestDFSStorageStateRecovery
Running org.apache.hadoop.hdfs.TestFileCreationEmpty
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.347 sec - in org.apache.hadoop.hdfs.TestFileCreationEmpty
Running org.apache.hadoop.hdfs.TestSetrepIncreasing
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.911 sec - in org.apache.hadoop.hdfs.TestSetrepIncreasing
Running org.apache.hadoop.hdfs.TestEncryptedTransfer
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 81.84 sec - in org.apache.hadoop.hdfs.TestEncryptedTransfer
Running org.apache.hadoop.hdfs.TestDFSUpgrade
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 35.066 sec - in org.apache.hadoop.hdfs.TestDFSUpgrade
Running org.apache.hadoop.hdfs.TestCrcCorruption
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.327 sec - in org.apache.hadoop.hdfs.TestCrcCorruption
Running org.apache.hadoop.hdfs.TestHFlush
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.7 sec - in org.apache.hadoop.hdfs.TestHFlush
Running org.apache.hadoop.hdfs.TestFileAppendRestart
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.028 sec - in org.apache.hadoop.hdfs.TestFileAppendRestart
Running org.apache.hadoop.hdfs.TestDatanodeReport
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.592 sec - in org.apache.hadoop.hdfs.TestDatanodeReport
Running org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Tests run: 10, Failures: 0, Errors: 0, Skipped: 10, Time elapsed: 0.198 sec - in org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Running org.apache.hadoop.hdfs.TestFileInputStreamCache
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.206 sec - in org.apache.hadoop.hdfs.TestFileInputStreamCache
Running org.apache.hadoop.hdfs.TestRestartDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.821 sec - in org.apache.hadoop.hdfs.TestRestartDFS
Running org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.829 sec - in org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Running org.apache.hadoop.hdfs.TestDFSRemove
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.013 sec - in org.apache.hadoop.hdfs.TestDFSRemove
Running org.apache.hadoop.hdfs.TestHDFSTrash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.038 sec - in org.apache.hadoop.hdfs.TestHDFSTrash
Running org.apache.hadoop.hdfs.TestClientReportBadBlock
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 62.419 sec - in org.apache.hadoop.hdfs.TestClientReportBadBlock
Running org.apache.hadoop.hdfs.TestQuota
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.554 sec - in org.apache.hadoop.hdfs.TestQuota
Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.728 sec - in org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Running org.apache.hadoop.hdfs.TestDatanodeRegistration
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.116 sec - in org.apache.hadoop.hdfs.TestDatanodeRegistration
Running org.apache.hadoop.hdfs.TestAbandonBlock
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.251 sec - in org.apache.hadoop.hdfs.TestAbandonBlock
Running org.apache.hadoop.hdfs.TestDFSShell
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 33.94 sec - in org.apache.hadoop.hdfs.TestDFSShell
Running org.apache.hadoop.hdfs.TestListFilesInDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.51 sec - in org.apache.hadoop.hdfs.TestListFilesInDFS
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Running org.apache.hadoop.hdfs.TestPeerCache
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.321 sec - in org.apache.hadoop.hdfs.TestPeerCache
Running org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 8.48 sec - in org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Running org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.162 sec - in org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Running org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 106.444 sec - in org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Running org.apache.hadoop.hdfs.TestMultiThreadedHflush
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.73 sec - in org.apache.hadoop.hdfs.TestMultiThreadedHflush
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.842 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Running org.apache.hadoop.hdfs.TestMiniDFSCluster
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.419 sec - in org.apache.hadoop.hdfs.TestMiniDFSCluster
Running org.apache.hadoop.hdfs.TestLease
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.163 sec - in org.apache.hadoop.hdfs.TestLease
Running org.apache.hadoop.hdfs.TestListFilesInFileContext
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.435 sec - in org.apache.hadoop.hdfs.TestListFilesInFileContext
Running org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.342 sec - in org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Running org.apache.hadoop.hdfs.TestDFSClientFailover
Tests run: 6, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 10.984 sec - in org.apache.hadoop.hdfs.TestDFSClientFailover
Running org.apache.hadoop.hdfs.TestFileAppend2
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.042 sec - in org.apache.hadoop.hdfs.TestFileAppend2
Running org.apache.hadoop.hdfs.TestLocalDFS
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.456 sec - in org.apache.hadoop.hdfs.TestLocalDFS
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.937 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.hdfs.TestSeekBug
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.825 sec - in org.apache.hadoop.hdfs.TestSeekBug
Running org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.032 sec - in org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Running org.apache.hadoop.hdfs.util.TestChunkedArrayList
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.361 sec - in org.apache.hadoop.hdfs.util.TestChunkedArrayList
Running org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.198 sec - in org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Running org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.186 sec - in org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Running org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Running org.apache.hadoop.hdfs.util.TestDiff
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.275 sec - in org.apache.hadoop.hdfs.util.TestDiff
Running org.apache.hadoop.hdfs.util.TestMD5FileUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.226 sec - in org.apache.hadoop.hdfs.util.TestMD5FileUtils
Running org.apache.hadoop.hdfs.util.TestDirectBufferPool
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.092 sec - in org.apache.hadoop.hdfs.util.TestDirectBufferPool
Running org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.159 sec - in org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Running org.apache.hadoop.hdfs.util.TestXMLUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.059 sec - in org.apache.hadoop.hdfs.util.TestXMLUtils
Running org.apache.hadoop.hdfs.util.TestCyclicIteration
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec - in org.apache.hadoop.hdfs.util.TestCyclicIteration
Running org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.165 sec - in org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Running org.apache.hadoop.hdfs.TestSetTimes
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.52 sec - in org.apache.hadoop.hdfs.TestSetTimes
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.164 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Running org.apache.hadoop.hdfs.TestBlockReaderLocal
Tests run: 11, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 11.993 sec - in org.apache.hadoop.hdfs.TestBlockReaderLocal
Running org.apache.hadoop.hdfs.TestHftpURLTimeouts
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.986 sec - in org.apache.hadoop.hdfs.TestHftpURLTimeouts
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 49.612 sec - in org.apache.hadoop.cli.TestHDFSCLI
Running org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.132 sec - in org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Running org.apache.hadoop.fs.TestGlobPaths
Tests run: 31, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 4.419 sec - in org.apache.hadoop.fs.TestGlobPaths
Running org.apache.hadoop.fs.TestResolveHdfsSymlink
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.328 sec - in org.apache.hadoop.fs.TestResolveHdfsSymlink
Running org.apache.hadoop.fs.TestFcHdfsSetUMask
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.517 sec - in org.apache.hadoop.fs.TestFcHdfsSetUMask
Running org.apache.hadoop.fs.TestFcHdfsPermission
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.511 sec - in org.apache.hadoop.fs.TestFcHdfsPermission
Running org.apache.hadoop.fs.TestUrlStreamHandler
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.69 sec - in org.apache.hadoop.fs.TestUrlStreamHandler
Running org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.122 sec - in org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.737 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.049 sec - in org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.54 sec - in org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.435 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.111 sec - in org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Running org.apache.hadoop.fs.permission.TestStickyBit
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.13 sec - in org.apache.hadoop.fs.permission.TestStickyBit
Running org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.2 sec - in org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Running org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Tests run: 69, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.22 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Running org.apache.hadoop.fs.shell.TestHdfsTextCommand
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.454 sec - in org.apache.hadoop.fs.shell.TestHdfsTextCommand
Running org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.809 sec - in org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Running org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.715 sec - in org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Running org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Tests run: 7, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 0.208 sec - in org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Running org.apache.hadoop.fs.TestSymlinkHdfsDisable
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.979 sec - in org.apache.hadoop.fs.TestSymlinkHdfsDisable
Running org.apache.hadoop.fs.TestVolumeId
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.067 sec - in org.apache.hadoop.fs.TestVolumeId
Running org.apache.hadoop.fs.TestSymlinkHdfsFileSystem
Tests run: 72, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 8.382 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileSystem

Results :

Tests in error: 
  TestWebHdfsFileSystemContract.testResponseCode:465 » IO All datanodes 127.0.0....
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteHalfABlock:240->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » SocketTimeout

Tests run: 2127, Failures: 0, Errors: 2, Skipped: 50

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:45:00.623s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [2.057s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:45:03.529s
[INFO] Finished at: Sat Oct 12 13:19:44 UTC 2013
[INFO] Final Memory: 31M/296M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Updating HADOOP-10040
Updating HDFS-5322
Updating YARN-1044
Updating HDFS-5267
Updating YARN-1300
Updating HDFS-5276
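
For anyone trying to reproduce the two TestWebHdfsFileSystemContract errors locally, here is a minimal sketch of re-running just that test class with the switches the Maven error block above suggests. It assumes a local trunk checkout; the -Dtest filter is a standard Surefire option and is not taken from this log.

    # from the root of a trunk checkout, limit the run to the failing module
    cd hadoop-hdfs-project/hadoop-hdfs
    # re-run only the failing contract test, with full stack traces (-e)
    mvn -e test -Dtest=TestWebHdfsFileSystemContract
    # if the stack trace is not enough, add -X for full debug logging
    mvn -X -e test -Dtest=TestWebHdfsFileSystemContract
    # individual reports land under target/surefire-reports, as noted above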

Build failed in Jenkins: Hadoop-Hdfs-trunk #1549

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Hadoop-Hdfs-trunk/1549/changes>

Changes:

[llu] Move HDFS-5276 to 2.3.0 in CHANGES.txt

[llu] HDFS-5276. Remove volatile from LightWeightHashSet. (Junping Du via llu)

[llu] YARN-7. Support CPU resource for DistributedShell. (Junping Du via llu)

[jing9] HADOOP-10039. Add Hive to the list of projects using AbstractDelegationTokenSecretManager. Contributed by Haohui Mai.

[suresh] HDFS-5335. Hive query failed with possible race in dfs output stream. Contributed by Haohui Mai.

[sandy] YARN-1265. Fair Scheduler chokes on unhealthy node reconnect (Sandy Ryza)

[suresh] HADOOP-10029. Specifying har file to MR job fails in secure cluster. Contributed by Suresh Srinivas.

------------------------------------------
[...truncated 11402 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.415 sec - in org.apache.hadoop.hdfs.TestListPathServlet
Running org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.162 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitRead
Running org.apache.hadoop.hdfs.TestDFSStorageStateRecovery
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 357.066 sec - in org.apache.hadoop.hdfs.TestDFSStorageStateRecovery
Running org.apache.hadoop.hdfs.TestFileCreationEmpty
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.277 sec - in org.apache.hadoop.hdfs.TestFileCreationEmpty
Running org.apache.hadoop.hdfs.TestSetrepIncreasing
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.854 sec - in org.apache.hadoop.hdfs.TestSetrepIncreasing
Running org.apache.hadoop.hdfs.TestEncryptedTransfer
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 81.338 sec - in org.apache.hadoop.hdfs.TestEncryptedTransfer
Running org.apache.hadoop.hdfs.TestDFSUpgrade
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 33.842 sec - in org.apache.hadoop.hdfs.TestDFSUpgrade
Running org.apache.hadoop.hdfs.TestCrcCorruption
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.38 sec - in org.apache.hadoop.hdfs.TestCrcCorruption
Running org.apache.hadoop.hdfs.TestHFlush
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.019 sec - in org.apache.hadoop.hdfs.TestHFlush
Running org.apache.hadoop.hdfs.TestFileAppendRestart
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.052 sec - in org.apache.hadoop.hdfs.TestFileAppendRestart
Running org.apache.hadoop.hdfs.TestDatanodeReport
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.508 sec - in org.apache.hadoop.hdfs.TestDatanodeReport
Running org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Tests run: 10, Failures: 0, Errors: 0, Skipped: 10, Time elapsed: 0.197 sec - in org.apache.hadoop.hdfs.TestShortCircuitLocalRead
Running org.apache.hadoop.hdfs.TestFileInputStreamCache
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.205 sec - in org.apache.hadoop.hdfs.TestFileInputStreamCache
Running org.apache.hadoop.hdfs.TestRestartDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.701 sec - in org.apache.hadoop.hdfs.TestRestartDFS
Running org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.734 sec - in org.apache.hadoop.hdfs.TestDFSUpgradeFromImage
Running org.apache.hadoop.hdfs.TestDFSRemove
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.127 sec - in org.apache.hadoop.hdfs.TestDFSRemove
Running org.apache.hadoop.hdfs.TestHDFSTrash
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.992 sec - in org.apache.hadoop.hdfs.TestHDFSTrash
Running org.apache.hadoop.hdfs.TestClientReportBadBlock
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 40.037 sec - in org.apache.hadoop.hdfs.TestClientReportBadBlock
Running org.apache.hadoop.hdfs.TestQuota
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.924 sec - in org.apache.hadoop.hdfs.TestQuota
Running org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.934 sec - in org.apache.hadoop.hdfs.TestFileLengthOnClusterRestart
Running org.apache.hadoop.hdfs.TestDatanodeRegistration
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.95 sec - in org.apache.hadoop.hdfs.TestDatanodeRegistration
Running org.apache.hadoop.hdfs.TestAbandonBlock
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.282 sec - in org.apache.hadoop.hdfs.TestAbandonBlock
Running org.apache.hadoop.hdfs.TestDFSShell
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.221 sec - in org.apache.hadoop.hdfs.TestDFSShell
Running org.apache.hadoop.hdfs.TestListFilesInDFS
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.31 sec - in org.apache.hadoop.hdfs.TestListFilesInDFS
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.163 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadUnCached
Running org.apache.hadoop.hdfs.TestPeerCache
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.316 sec - in org.apache.hadoop.hdfs.TestPeerCache
Running org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Tests run: 3, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 8.503 sec - in org.apache.hadoop.hdfs.TestAppendDifferentChecksum
Running org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.06 sec - in org.apache.hadoop.hdfs.TestDFSClientExcludedNodes
Running org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 154.607 sec - in org.apache.hadoop.hdfs.TestDatanodeBlockScanner
Running org.apache.hadoop.hdfs.TestMultiThreadedHflush
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.761 sec - in org.apache.hadoop.hdfs.TestMultiThreadedHflush
Running org.apache.hadoop.hdfs.TestDFSAddressConfig
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.813 sec - in org.apache.hadoop.hdfs.TestDFSAddressConfig
Running org.apache.hadoop.hdfs.TestMiniDFSCluster
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.509 sec - in org.apache.hadoop.hdfs.TestMiniDFSCluster
Running org.apache.hadoop.hdfs.TestLease
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.805 sec - in org.apache.hadoop.hdfs.TestLease
Running org.apache.hadoop.hdfs.TestListFilesInFileContext
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.391 sec - in org.apache.hadoop.hdfs.TestListFilesInFileContext
Running org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.275 sec - in org.apache.hadoop.hdfs.TestDFSShellGenericOptions
Running org.apache.hadoop.hdfs.TestDFSClientFailover
Tests run: 6, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 10.283 sec - in org.apache.hadoop.hdfs.TestDFSClientFailover
Running org.apache.hadoop.hdfs.TestFileAppend2
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.415 sec - in org.apache.hadoop.hdfs.TestFileAppend2
Running org.apache.hadoop.hdfs.TestLocalDFS
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.206 sec - in org.apache.hadoop.hdfs.TestLocalDFS
Running org.apache.hadoop.hdfs.TestReadWhileWriting
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.796 sec - in org.apache.hadoop.hdfs.TestReadWhileWriting
Running org.apache.hadoop.hdfs.TestSeekBug
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.666 sec - in org.apache.hadoop.hdfs.TestSeekBug
Running org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.868 sec - in org.apache.hadoop.hdfs.TestBlocksScheduledCounter
Running org.apache.hadoop.hdfs.util.TestChunkedArrayList
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.358 sec - in org.apache.hadoop.hdfs.util.TestChunkedArrayList
Running org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.199 sec - in org.apache.hadoop.hdfs.util.TestBestEffortLongFile
Running org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.199 sec - in org.apache.hadoop.hdfs.util.TestAtomicFileOutputStream
Running org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec - in org.apache.hadoop.hdfs.util.TestExactSizeInputStream
Running org.apache.hadoop.hdfs.util.TestDiff
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.249 sec - in org.apache.hadoop.hdfs.util.TestDiff
Running org.apache.hadoop.hdfs.util.TestMD5FileUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.229 sec - in org.apache.hadoop.hdfs.util.TestMD5FileUtils
Running org.apache.hadoop.hdfs.util.TestDirectBufferPool
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.095 sec - in org.apache.hadoop.hdfs.util.TestDirectBufferPool
Running org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.159 sec - in org.apache.hadoop.hdfs.util.TestLightWeightHashSet
Running org.apache.hadoop.hdfs.util.TestXMLUtils
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec - in org.apache.hadoop.hdfs.util.TestXMLUtils
Running org.apache.hadoop.hdfs.util.TestCyclicIteration
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.058 sec - in org.apache.hadoop.hdfs.util.TestCyclicIteration
Running org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.164 sec - in org.apache.hadoop.hdfs.util.TestLightWeightLinkedSet
Running org.apache.hadoop.hdfs.TestSetTimes
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.307 sec - in org.apache.hadoop.hdfs.TestSetTimes
Running org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Tests run: 4, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 0.164 sec - in org.apache.hadoop.hdfs.TestParallelShortCircuitReadNoChecksum
Running org.apache.hadoop.hdfs.TestBlockReaderLocal
Tests run: 11, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 11.799 sec - in org.apache.hadoop.hdfs.TestBlockReaderLocal
Running org.apache.hadoop.hdfs.TestHftpURLTimeouts
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.992 sec - in org.apache.hadoop.hdfs.TestHftpURLTimeouts
Running org.apache.hadoop.cli.TestHDFSCLI
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.66 sec - in org.apache.hadoop.cli.TestHDFSCLI
Running org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.132 sec - in org.apache.hadoop.fs.TestHdfsNativeCodeLoader
Running org.apache.hadoop.fs.TestGlobPaths
Tests run: 31, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 4.343 sec - in org.apache.hadoop.fs.TestGlobPaths
Running org.apache.hadoop.fs.TestResolveHdfsSymlink
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.329 sec - in org.apache.hadoop.fs.TestResolveHdfsSymlink
Running org.apache.hadoop.fs.TestFcHdfsSetUMask
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.463 sec - in org.apache.hadoop.fs.TestFcHdfsSetUMask
Running org.apache.hadoop.fs.TestFcHdfsPermission
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.29 sec - in org.apache.hadoop.fs.TestFcHdfsPermission
Running org.apache.hadoop.fs.TestUrlStreamHandler
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.522 sec - in org.apache.hadoop.fs.TestUrlStreamHandler
Running org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.984 sec - in org.apache.hadoop.fs.viewfs.TestViewFsDefaultValue
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.677 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.042 sec - in org.apache.hadoop.fs.viewfs.TestViewFsHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Tests run: 42, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.401 sec - in org.apache.hadoop.fs.viewfs.TestViewFsAtHdfsRoot
Running org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Tests run: 39, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.306 sec - in org.apache.hadoop.fs.viewfs.TestViewFileSystemHdfs
Running org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.098 sec - in org.apache.hadoop.fs.viewfs.TestViewFsFileStatusHdfs
Running org.apache.hadoop.fs.permission.TestStickyBit
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.433 sec - in org.apache.hadoop.fs.permission.TestStickyBit
Running org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.503 sec - in org.apache.hadoop.fs.loadGenerator.TestLoadGenerator
Running org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Tests run: 69, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.952 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileContext
Running org.apache.hadoop.fs.shell.TestHdfsTextCommand
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.563 sec - in org.apache.hadoop.fs.shell.TestHdfsTextCommand
Running org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.341 sec - in org.apache.hadoop.fs.TestFcHdfsCreateMkdir
Running org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Tests run: 60, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.717 sec - in org.apache.hadoop.fs.TestHDFSFileContextMainOperations
Running org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Tests run: 7, Failures: 0, Errors: 0, Skipped: 6, Time elapsed: 0.209 sec - in org.apache.hadoop.fs.TestEnhancedByteBufferAccess
Running org.apache.hadoop.fs.TestSymlinkHdfsDisable
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.032 sec - in org.apache.hadoop.fs.TestSymlinkHdfsDisable
Running org.apache.hadoop.fs.TestVolumeId
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.066 sec - in org.apache.hadoop.fs.TestVolumeId
Running org.apache.hadoop.fs.TestSymlinkHdfsFileSystem
Tests run: 72, Failures: 0, Errors: 0, Skipped: 2, Time elapsed: 7.838 sec - in org.apache.hadoop.fs.TestSymlinkHdfsFileSystem

Results :

Tests in error: 
  TestWebHdfsFileSystemContract.testResponseCode:465 » IO All datanodes 127.0.0....
  TestWebHdfsFileSystemContract>FileSystemContractBaseTest.testWriteReadAndDeleteHalfABlock:240->FileSystemContractBaseTest.writeReadAndDelete:263->FileSystemContractBaseTest.writeAndRead:797 » SocketTimeout

Tests run: 2126, Failures: 0, Errors: 2, Skipped: 50

[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HttpFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS BookKeeper Journal
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Skipping Apache Hadoop HDFS-NFS
[INFO] This project has been banned from the build due to previous failures.
[INFO] ------------------------------------------------------------------------
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop HDFS Project 3.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[WARNING] The POM for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0 is missing, no dependency information available
[WARNING] Failed to retrieve plugin descriptor for org.eclipse.m2e:lifecycle-mapping:1.0.0: Plugin org.eclipse.m2e:lifecycle-mapping:1.0.0 or one of its dependencies could not be resolved: Failed to read artifact descriptor for org.eclipse.m2e:lifecycle-mapping:jar:1.0.0
[INFO] 
[INFO] --- maven-clean-plugin:2.4.1:clean (default-clean) @ hadoop-hdfs-project ---
[INFO] Deleting <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target>
[INFO] 
[INFO] --- maven-antrun-plugin:1.6:run (create-testdirs) @ hadoop-hdfs-project ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/target/test-dir>
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:44:25.467s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [3.993s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:44:30.306s
[INFO] Finished at: Fri Oct 11 13:19:16 UTC 2013
[INFO] Final Memory: 35M/397M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Hadoop-Hdfs-trunk/ws/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Updating YARN-1265
Updating HADOOP-10039
Updating HADOOP-10029
Updating HDFS-5335
Updating HDFS-5276
Updating YARN-7

Hadoop-Hdfs-trunk - Build # 1549 - Still Failing

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See https://builds.apache.org/job/Hadoop-Hdfs-trunk/1549/

###################################################################################
########################## LAST 60 LINES OF THE CONSOLE ###########################
[...truncated 11595 lines...]

main:
    [mkdir] Created dir: /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-source-plugin:2.1.2:test-jar-no-fork (hadoop-java-sources) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (dist-enforce) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-site-plugin:3.0:attach-descriptor (attach-descriptor) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-javadoc-plugin:2.8.1:jar (module-javadocs) @ hadoop-hdfs-project ---
[INFO] Not executing Javadoc as the project is not a Java classpath-capable package
[INFO] 
[INFO] --- maven-enforcer-plugin:1.3.1:enforce (depcheck) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.6:checkstyle (default-cli) @ hadoop-hdfs-project ---
[INFO] 
[INFO] --- findbugs-maven-plugin:2.3.2:findbugs (default-cli) @ hadoop-hdfs-project ---
[INFO] ****** FindBugsMojo execute *******
[INFO] canGenerate is false
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Hadoop HDFS ................................ FAILURE [1:44:25.467s]
[INFO] Apache Hadoop HttpFS .............................. SKIPPED
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SKIPPED
[INFO] Apache Hadoop HDFS-NFS ............................ SKIPPED
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [3.993s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:44:30.306s
[INFO] Finished at: Fri Oct 11 13:19:16 UTC 2013
[INFO] Final Memory: 35M/397M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.16:test (default-test) on project hadoop-hdfs: There are test failures.
[ERROR] 
[ERROR] Please refer to /home/jenkins/jenkins-slave/workspace/Hadoop-Hdfs-trunk/trunk/hadoop-hdfs-project/hadoop-hdfs/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
Build step 'Execute shell' marked build as failure
Archiving artifacts
Updating YARN-1265
Updating HADOOP-10039
Updating HADOOP-10029
Updating HDFS-5335
Updating HDFS-5276
Updating YARN-7
Sending e-mails to: hdfs-dev@hadoop.apache.org
Email was triggered for: Failure
Sending email for trigger: Failure



###################################################################################
############################## FAILED TESTS (if any) ##############################
No tests ran.