Posted to builds@hbase.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2015/11/02 10:47:24 UTC

Build failed in Jenkins: HBase-1.3 » latest1.8,Hadoop #335

See <https://builds.apache.org/job/HBase-1.3/jdk=latest1.8,label=Hadoop/335/changes>

Changes:

[stack] HBASE-14725 Vet categorization of tests so they for sure go into the

------------------------------------------
[...truncated 48416 lines...]

org.apache.hadoop.hbase.replication.TestReplicationKillSlaveRS.org.apache.hadoop.hbase.replication.TestReplicationKillSlaveRS
  Run 1: TestReplicationKillSlaveRS>TestReplicationBase.setUpBeforeClass:128 » IO Shutt...
  Run 2: TestReplicationKillSlaveRS>TestReplicationBase.tearDownAfterClass:158 » NullPointer

org.apache.hadoop.hbase.replication.TestReplicationSmallTests.org.apache.hadoop.hbase.replication.TestReplicationSmallTests
  Run 1: TestReplicationSmallTests>TestReplicationBase.setUpBeforeClass:108 » OutOfMemory
  Run 2: TestReplicationSmallTests>TestReplicationBase.tearDownAfterClass:158 » NullPointer

org.apache.hadoop.hbase.replication.multiwal.TestReplicationKillMasterRSCompressedWithMultipleWAL.org.apache.hadoop.hbase.replication.multiwal.TestReplicationKillMasterRSCompressedWithMultipleWAL
  Run 1: TestReplicationKillMasterRSCompressedWithMultipleWAL.setUpBeforeClass:32->TestReplicationKillMasterRSCompressed.setUpBeforeClass:39->TestReplicationBase.setUpBeforeClass:128 » OutOfMemory
  Run 2: TestReplicationKillMasterRSCompressedWithMultipleWAL>TestReplicationBase.tearDownAfterClass:158 » NullPointer

org.apache.hadoop.hbase.replication.multiwal.TestReplicationSyncUpToolWithMultipleWAL.org.apache.hadoop.hbase.replication.multiwal.TestReplicationSyncUpToolWithMultipleWAL
  Run 1: TestReplicationSyncUpToolWithMultipleWAL.setUpBeforeClass:32->TestReplicationBase.setUpBeforeClass:128 » IO
  Run 2: TestReplicationSyncUpToolWithMultipleWAL>TestReplicationBase.tearDownAfterClass:158 » NullPointer

  TestReplicationSink.org.apache.hadoop.hbase.replication.regionserver.TestReplicationSink » Remote
org.apache.hadoop.hbase.security.access.TestAccessController.org.apache.hadoop.hbase.security.access.TestAccessController
  Run 1: TestAccessController.setupBeforeClass:218 » IO Shutting down
  Run 2: TestAccessController.tearDownAfterClass:258->cleanUp:322->SecureTestUtil.deleteTable:696->SecureTestUtil.deleteTable:713 » NullPointer

org.apache.hadoop.hbase.security.access.TestAccessController3.org.apache.hadoop.hbase.security.access.TestAccessController3
  Run 1: TestAccessController3.setupBeforeClass:152 » IO Shutting down
  Run 2: TestAccessController3.tearDownAfterClass:197 NullPointer

  TestTablePermissions.beforeClass:98 » IO Shutting down
  TestExportSnapshot.setUpBeforeClass:95 » OutOfMemory unable to create new nati...
  TestExportSnapshot.tearDown:133 » TestTimedOut test timed out after 180 second...
  TestExportSnapshot.setUp:128 » HBaseSnapshot org.apache.hadoop.hbase.snapshot....
Flaked tests: 
org.apache.hadoop.hbase.TestZooKeeper.testRegionServerSessionExpired(org.apache.hadoop.hbase.TestZooKeeper)
  Run 1: TestZooKeeper.setUp:112 » IO Shutting down
  Run 2: TestZooKeeper.after:120 NullPointer
  Run 3: PASS

org.apache.hadoop.hbase.io.encoding.TestEncodedSeekers.testEncodedSeeker[15](org.apache.hadoop.hbase.io.encoding.TestEncodedSeekers)
  Run 1: TestEncodedSeekers.testEncodedSeeker:115 » IO Cannot run program "chmod": erro...
  Run 2: PASS

org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite.testNotCachingDataBlocksDuringCompaction[17](org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite)
  Run 1: TestCacheOnWrite.testNotCachingDataBlocksDuringCompaction:458->testNotCachingDataBlocksDuringCompactionInternals:440 » OutOfMemory
  Run 2: PASS

org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite.testNotCachingDataBlocksDuringCompaction[18](org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite)
  Run 1: TestCacheOnWrite.testNotCachingDataBlocksDuringCompaction:458->testNotCachingDataBlocksDuringCompactionInternals:402 » OutOfMemory
  Run 2: PASS

org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite.testStoreFileCacheOnWrite[17](org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite)
  Run 1: TestCacheOnWrite.testStoreFileCacheOnWrite:452->testStoreFileCacheOnWriteInternals:239->writeStoreFile:360 » IO
  Run 2: PASS

org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite.testStoreFileCacheOnWrite[18](org.apache.hadoop.hbase.io.hfile.TestCacheOnWrite)
  Run 1: TestCacheOnWrite.testStoreFileCacheOnWrite:452->testStoreFileCacheOnWriteInternals:239->writeStoreFile:360 » IO
  Run 2: PASS

org.apache.hadoop.hbase.mapred.TestTableSnapshotInputFormat.testWithMapReduceMultiRegion(org.apache.hadoop.hbase.mapred.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat.testWithMapReduceMultiRegion:140->TableSnapshotInputFormatTestBase.testWithMapReduce:165->testWithMapReduceImpl:224->doTestWithMapReduce:234->TableSnapshotInputFormatTestBase.createTableAndSnapshot:202 » IO
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestCopyTable.testCopyTableWithBulkload(org.apache.hadoop.hbase.mapreduce.TestCopyTable)
  Run 1: TestCopyTable.testCopyTableWithBulkload:131->doCopyTableTest:101 copy job failed expected:<0> but was:<1>
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestCopyTable.testRenameFamily(org.apache.hadoop.hbase.mapreduce.TestCopyTable)
  Run 1: TestCopyTable.testRenameFamily:214 null
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestRowCounter.testRowCounterExclusiveColumn(org.apache.hadoop.hbase.mapreduce.TestRowCounter)
  Run 1: TestRowCounter.testRowCounterExclusiveColumn:122->runRowCount:212 null
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestRowCounter.testRowCounterTimeRange(org.apache.hadoop.hbase.mapreduce.TestRowCounter)
  Run 1: TestRowCounter.testRowCounterTimeRange:187->runRowCount:212 null
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles.testRegionCrossingHFileSplit(org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles)
  Run 1: TestSecureLoadIncrementalHFiles>TestLoadIncrementalHFiles.testRegionCrossingHFileSplit:173->TestLoadIncrementalHFiles.testRegionCrossingHFileSplit:217->TestLoadIncrementalHFiles.runTest:241->TestLoadIncrementalHFiles.runTest:254->TestLoadIncrementalHFiles.runTest:276 » IO
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles.testRegionCrossingRowColBloom(org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles)
  Run 1: TestSecureLoadIncrementalHFiles.testRegionCrossingRowColBloom » Remote java.la...
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles.testSimpleHFileSplit(org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles)
  Run 1: TestSecureLoadIncrementalHFiles>TestLoadIncrementalHFiles.testSimpleHFileSplit:155->TestLoadIncrementalHFiles.runTest:245->TestLoadIncrementalHFiles.runTest:254->TestLoadIncrementalHFiles.runTest:293 » OutOfMemory
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1.testScanEmptyToBBB(org.apache.hadoop.hbase.mapreduce.TestTableInputFormatScan1)
  Run 1: TestTableInputFormatScan1.testScanEmptyToBBB:86->TestTableInputFormatScanBase.testScan:244 null
  Run 2: PASS

org.apache.hadoop.hbase.regionserver.TestFailedAppendAndSync.testLockupAroundBadAssignSync(org.apache.hadoop.hbase.regionserver.TestFailedAppendAndSync)
  Run 1: TestFailedAppendAndSync.testLockupAroundBadAssignSync:242 » TestTimedOut test ...
  Run 2: PASS

org.apache.hadoop.hbase.replication.TestMultiSlaveReplication.testMultiSlaveReplication(org.apache.hadoop.hbase.replication.TestMultiSlaveReplication)
  Run 1: TestMultiSlaveReplication.testMultiSlaveReplication:128 » IO Shutting down
  Run 2: PASS

org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationForFlushAndCompaction(org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint)
  Run 1: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationForFlushAndCompaction:314 » RetriesExhaustedWithDetails
  Run 2: PASS

org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationIgnoresDisabledTables(org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint)
  Run 1: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationIgnoresDisabledTables:330->testRegionReplicaReplicationIgnoresDisabledTables:366 » Runtime
  Run 2: PASS

org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationIgnoresDroppedTables(org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint)
  Run 1: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationIgnoresDroppedTables:335->testRegionReplicaReplicationIgnoresDisabledTables:348 » TestTimedOut
  Run 2: PASS

org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationPeerIsCreated(org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint)
  Run 1: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationPeerIsCreated:121 » RetriesExhausted
  Run 2: PASS

org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationPeerIsCreatedForModifyTable(org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint)
  Run 1: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationPeerIsCreatedForModifyTable:151 » RetriesExhausted
  Run 2: PASS

org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationWith10Replicas(org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint)
  Run 1: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationWith10Replicas:259->testRegionReplicaReplication:178 » RetriesExhausted
  Run 2: PASS

org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationWith2Replicas(org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint)
  Run 1: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationWith2Replicas:249->testRegionReplicaReplication:178 » RetriesExhausted
  Run 2: PASS

org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationWith3Replicas(org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint)
  Run 1: TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationWith3Replicas:254->testRegionReplicaReplication:200 » TestTimedOut
  Run 2: PASS

org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint.testRegionReplicaWithoutMemstoreReplication(org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint)
  Run 1: TestRegionReplicaReplicationEndpoint.testRegionReplicaWithoutMemstoreReplication:269 » RetriesExhausted
  Run 2: PASS

org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpointNoMaster.testReplayCallable(org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpointNoMaster)
  Run 1: TestRegionReplicaReplicationEndpointNoMaster.testReplayCallable:176 » Runtime ...
  Run 2: PASS

org.apache.hadoop.hbase.snapshot.TestFlushSnapshotFromClient.testFlushTableSnapshot(org.apache.hadoop.hbase.snapshot.TestFlushSnapshotFromClient)
  Run 1: TestFlushSnapshotFromClient.testFlushTableSnapshot:147 » HBaseSnapshot org.apa...
  Run 2: PASS

org.apache.hadoop.hbase.snapshot.TestFlushSnapshotFromClient.testSnapshotStateAfterMerge(org.apache.hadoop.hbase.snapshot.TestFlushSnapshotFromClient)
  Run 1: TestFlushSnapshotFromClient.testSnapshotStateAfterMerge:328 » RestoreSnapshot ...
  Run 2: PASS

org.apache.hadoop.hbase.snapshot.TestFlushSnapshotFromClient.testTakeSnapshotAfterMerge(org.apache.hadoop.hbase.snapshot.TestFlushSnapshotFromClient)
  Run 1: TestFlushSnapshotFromClient.testTakeSnapshotAfterMerge » Remote java.lang.OutO...
  Run 2: PASS

org.apache.hadoop.hbase.snapshot.TestSecureExportSnapshot.testExportFileSystemState(org.apache.hadoop.hbase.snapshot.TestSecureExportSnapshot)
  Run 1: TestSecureExportSnapshot>TestExportSnapshot.testExportFileSystemState:198->TestExportSnapshot.testExportFileSystemState:266->TestExportSnapshot.testExportFileSystemState:294 expected:<0> but was:<1>
  Run 2: TestSecureExportSnapshot>TestExportSnapshot.setUp:117 » IO java.util.concurren...
  Run 3: TestSecureExportSnapshot>TestExportSnapshot.tearDown:133 » TableNotFound testt...
  Run 4: PASS


Tests run: 2118, Failures: 2, Errors: 31, Skipped: 36, Flakes: 31

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache HBase ...................................... SUCCESS [1:04.720s]
[INFO] Apache HBase - Checkstyle ......................... SUCCESS [5.478s]
[INFO] Apache HBase - Resource Bundle .................... SUCCESS [0.315s]
[INFO] Apache HBase - Annotations ........................ SUCCESS [1.485s]
[INFO] Apache HBase - Protocol ........................... SUCCESS [21.796s]
[INFO] Apache HBase - Common ............................. SUCCESS [2:07.883s]
[INFO] Apache HBase - Procedure .......................... SUCCESS [1:00.292s]
[INFO] Apache HBase - Client ............................. SUCCESS [56.289s]
[INFO] Apache HBase - Hadoop Compatibility ............... SUCCESS [10.314s]
[INFO] Apache HBase - Hadoop Two Compatibility ........... SUCCESS [10.111s]
[INFO] Apache HBase - Prefix Tree ........................ SUCCESS [10.924s]
[INFO] Apache HBase - Server ............................. FAILURE [1:15:55.837s]
[INFO] Apache HBase - Testing Util ....................... SKIPPED
[INFO] Apache HBase - Thrift ............................. SKIPPED
[INFO] Apache HBase - Rest ............................... SKIPPED
[INFO] Apache HBase - Shell .............................. SKIPPED
[INFO] Apache HBase - Integration Tests .................. SKIPPED
[INFO] Apache HBase - Examples ........................... SKIPPED
[INFO] Apache HBase - External Block Cache ............... SKIPPED
[INFO] Apache HBase - Assembly ........................... SKIPPED
[INFO] Apache HBase - Shaded ............................. SKIPPED
[INFO] Apache HBase - Shaded - Client .................... SKIPPED
[INFO] Apache HBase - Shaded - Server .................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:22:22.516s
[INFO] Finished at: Mon Nov 02 09:44:05 UTC 2015
[INFO] Final Memory: 437M/982M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.18.1:test (secondPartTestsExecution) on project hbase-server: ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.maven.surefire.report.ReporterException: When writing xml report stdout/stderr: /tmp/stderr5758563952863884726deferred (No such file or directory) -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hbase-server
Build step 'Invoke top-level Maven targets' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script  :     ZOMBIE_TESTS_COUNT=`jps -v | grep surefirebooter | grep '-Dhbase.test' | wc -l`
  if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
    #It seems sometimes the tests are not dying immediately. Let's give them 30s
    echo "Suspicious java process found - waiting 30s to see if there are just slow to stop"
    sleep 30
    ZOMBIE_TESTS_COUNT=`jps -v | grep surefirebooter | grep '-Dhbase.test' | wc -l`
    if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
      echo "There are $ZOMBIE_TESTS_COUNT zombie tests, they should have been killed by surefire but survived"
      echo "************ zombies jps listing"
      jps -v
      jps -v | grep surefirebooter | grep '-Dhbase.test'
      echo "************ BEGIN zombies jstack extract"
      # HBase tests have been flagged with an innocuous '-Dhbase.test' just so they can
      # be identified as hbase in a process listing.
      ZB_STACK=`jps -v | grep surefirebooter | grep '-Dhbase.test' | cut -d ' ' -f 1 | xargs -n 1 jstack | grep ".test" | grep "\.java"`
      jps -v | grep surefirebooter | grep '-Dhbase.test' | cut -d ' ' -f 1 | xargs -n 1 jstack
      echo "************ END  zombies jstack extract"
      JIRA_COMMENT="$JIRA_COMMENT

     {color:red}-1 core zombie tests{color}.  There are ${ZOMBIE_TESTS_COUNT} zombie test(s): ${ZB_STACK}"
      BAD=1
      # Killing these zombies
      echo 'Killing ZOMBIES!!!'
      jps -v
      jps -v | grep surefirebooter | grep '-Dhbase.test' | cut -d ' ' -f 1 | xargs kill -9
    else
      echo "We're ok: there is no zombie test, but some tests took some time to stop"
    fi
[Hadoop] $ /bin/bash -xe /tmp/hudson7292855548474512630.sh
++ wc -l
++ grep -Dhbase.test
++ grep surefirebooter
grep: unknown devices method
++ jps -v
grep: write error: Broken pipe
+ ZOMBIE_TESTS_COUNT=0
/tmp/hudson7292855548474512630.sh: line 30: syntax error: unexpected end of file
POST BUILD TASK : FAILURE
END OF POST BUILD TASK : 0
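
A note on the trace above, for anyone re-running the zombie check by hand: grep parses the argument -Dhbase.test as its own -D/--devices option, which is where "grep: unknown devices method" comes from and why ZOMBIE_TESTS_COUNT silently ends up 0; the script handed to bash also seems to stop before the outer if is closed, which would account for the "unexpected end of file" at line 30. Below is a minimal sketch of just the counting step with the option-parsing pitfall avoided. It is illustrative only, not the job's configured script, and it assumes jps is on the PATH:

    # Count surefire JVMs carrying the '-Dhbase.test' marker.
    # Passing the pattern after -e keeps grep from reading the leading -D as an option,
    # and ending the pipeline with wc -l keeps the count at 0 (not a grep failure)
    # when nothing matches, so it stays safe under 'bash -e'.
    ZOMBIE_TESTS_COUNT=`jps -v | grep surefirebooter | grep -e '-Dhbase.test' | wc -l`
    if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
      echo "Found $ZOMBIE_TESTS_COUNT possible zombie surefire JVM(s)"
    fi
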
Archiving artifacts
Recording test results

Jenkins build is back to normal : HBase-1.3 » latest1.8,Hadoop #336

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/HBase-1.3/jdk=latest1.8,label=Hadoop/336/changes>