Posted to builds@hbase.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2015/11/02 11:11:31 UTC

Build failed in Jenkins: HBase-1.2 » latest1.8,Hadoop #337

See <https://builds.apache.org/job/HBase-1.2/jdk=latest1.8,label=Hadoop/337/changes>

Changes:

[stack] HBASE-14725 Vet categorization of tests so they for sure go into the

------------------------------------------
[...truncated 49187 lines...]
org.apache.hadoop.hbase.client.TestFromClientSideWithCoprocessor.testIllegalTableDescriptor(org.apache.hadoop.hbase.client.TestFromClientSideWithCoprocessor)
  Run 1: TestFromClientSideWithCoprocessor.testIllegalTableDescriptor » Remote unable t...
  Run 2: PASS

org.apache.hadoop.hbase.client.TestFromClientSideWithCoprocessor.testKeepDeletedCells(org.apache.hadoop.hbase.client.TestFromClientSideWithCoprocessor)
  Run 1: TestFromClientSideWithCoprocessor>TestFromClientSide.testKeepDeletedCells:197 » Runtime
  Run 2: PASS

org.apache.hadoop.hbase.client.TestFromClientSideWithCoprocessor.testPutNoCF(org.apache.hadoop.hbase.client.TestFromClientSideWithCoprocessor)
  Run 1: TestFromClientSideWithCoprocessor.testPutNoCF » Remote unable to create new na...
  Run 2: PASS

org.apache.hadoop.hbase.client.TestMetaScanner.testMetaScanner(org.apache.hadoop.hbase.client.TestMetaScanner)
  Run 1: TestMetaScanner.testMetaScanner:77 » IO java.lang.reflect.InvocationTargetExce...
  Run 2: PASS

org.apache.hadoop.hbase.client.TestMultipleTimestamps.testReseeksWithMultipleFiles(org.apache.hadoop.hbase.client.TestMultipleTimestamps)
  Run 1: TestMultipleTimestamps.testReseeksWithMultipleFiles » Remote java.lang.OutOfMe...
  Run 2: TestMultipleTimestamps.testReseeksWithMultipleFiles:227 Waiting timed out after [60,000] msec table enabled in zk,
  Run 3: PASS

org.apache.hadoop.hbase.client.TestMultipleTimestamps.testWithVersionDeletes(org.apache.hadoop.hbase.client.TestMultipleTimestamps)
  Run 1: TestMultipleTimestamps.testWithVersionDeletes:282->testWithVersionDeletes:297 » IO
  Run 2: TestMultipleTimestamps.testWithVersionDeletes » Remote unable to create new na...
  Run 3: PASS

org.apache.hadoop.hbase.client.TestReplicasClient.testSmallScanWithReplicas(org.apache.hadoop.hbase.client.TestReplicasClient)
  Run 1: TestReplicasClient.testSmallScanWithReplicas:611->runMultipleScansOfOneType:721->scanWithReplicas:770 » Runtime
  Run 2: PASS

org.apache.hadoop.hbase.client.TestSnapshotFromClientWithRegionReplicas.testOfflineTableSnapshot(org.apache.hadoop.hbase.client.TestSnapshotFromClientWithRegionReplicas)
  Run 1: TestSnapshotFromClientWithRegionReplicas.testOfflineTableSnapshot » Remote jav...
  Run 2: TestSnapshotFromClientWithRegionReplicas.testOfflineTableSnapshot » Remote jav...
  Run 3: PASS

org.apache.hadoop.hbase.client.TestSnapshotFromClientWithRegionReplicas.testSnapshotDeletionWithRegex(org.apache.hadoop.hbase.client.TestSnapshotFromClientWithRegionReplicas)
  Run 1: TestSnapshotFromClientWithRegionReplicas>TestSnapshotFromClient.setup:99 » IO ...
  Run 2: TestSnapshotFromClientWithRegionReplicas.testSnapshotDeletionWithRegex » Remote
  Run 3: PASS

org.apache.hadoop.hbase.client.TestSnapshotFromClientWithRegionReplicas.testSnapshotFailsOnNonExistantTable(org.apache.hadoop.hbase.client.TestSnapshotFromClientWithRegionReplicas)
  Run 1: TestSnapshotFromClientWithRegionReplicas>TestSnapshotFromClient.setup:99 » TableExists
  Run 2: TestSnapshotFromClientWithRegionReplicas.testSnapshotFailsOnNonExistantTable » Remote
  Run 3: PASS

org.apache.hadoop.hbase.client.replication.TestReplicationAdminWithClusters.testEnableReplicationWhenSlaveClusterDoesntHaveTable(org.apache.hadoop.hbase.client.replication.TestReplicationAdminWithClusters)
  Run 1: TestReplicationAdminWithClusters.testEnableReplicationWhenSlaveClusterDoesntHaveTable:68 » IO
  Run 2: PASS

org.apache.hadoop.hbase.mapred.TestTableMapReduceUtil.shoudBeValidMapReduceEvaluation(org.apache.hadoop.hbase.mapred.TestTableMapReduceUtil)
  Run 1: TestTableMapReduceUtil.shoudBeValidMapReduceEvaluation:183 » IO Job failed!
  Run 2: TestTableMapReduceUtil.shoudBeValidMapReduceEvaluation:183 » OutOfMemory unabl...
  Run 3: PASS

org.apache.hadoop.hbase.mapred.TestTableMapReduceUtil.shoudBeValidMapReduceWithPartitionerEvaluation(org.apache.hadoop.hbase.mapred.TestTableMapReduceUtil)
  Run 1: TestTableMapReduceUtil.shoudBeValidMapReduceWithPartitionerEvaluation:206 » IO
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestCellCounter.testCellCounteOutOfTimeRange(org.apache.hadoop.hbase.mapreduce.TestCellCounter)
  Run 1: TestCellCounter.testCellCounteOutOfTimeRange:217->runCount:240 » IO Unable to ...
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestCopyTable.testStartStopRow(org.apache.hadoop.hbase.mapreduce.TestCopyTable)
  Run 1: TestCopyTable.testStartStopRow:161 » IO javax.xml.transform.TransformerExcepti...
  Run 2: TestCopyTable.testStartStopRow » Remote java.lang.OutOfMemoryError: unable to ...
  Run 3: PASS

org.apache.hadoop.hbase.mapreduce.TestHFileOutputFormat.testMRIncrementalLoad(org.apache.hadoop.hbase.mapreduce.TestHFileOutputFormat)
  Run 1: TestHFileOutputFormat.testMRIncrementalLoad:376->doIncrementalLoadTest:405 » YarnRuntime
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestImportTSVWithVisibilityLabels.testMROnTableWithDeletes(org.apache.hadoop.hbase.mapreduce.TestImportTSVWithVisibilityLabels)
  Run 1: TestImportTSVWithVisibilityLabels.testMROnTableWithDeletes:185->doMROnTableTest:320->doMROnTableTest:363 expected:<0> but was:<1>
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestMultiTableInputFormat.testScanEmptyToAPP(org.apache.hadoop.hbase.mapreduce.TestMultiTableInputFormat)
  Run 1: TestMultiTableInputFormat.testScanEmptyToAPP:186->testScan:248 null
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestMultiTableInputFormat.testScanEmptyToEmpty(org.apache.hadoop.hbase.mapreduce.TestMultiTableInputFormat)
  Run 1: TestMultiTableInputFormat.testScanEmptyToEmpty:180->testScan:248 null
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles.testNonHfileFolderWithUnmatchedFamilyName(org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles)
  Run 1: TestSecureLoadIncrementalHFiles.testNonHfileFolderWithUnmatchedFamilyName » Remote
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles.testRegionCrossingHFileSplit(org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles)
  Run 1: TestSecureLoadIncrementalHFiles>TestLoadIncrementalHFiles.testRegionCrossingHFileSplit:173->TestLoadIncrementalHFiles.testRegionCrossingHFileSplit:195->TestLoadIncrementalHFiles.runTest:219->TestLoadIncrementalHFiles.runTest:232->TestLoadIncrementalHFiles.runTest:265 » IllegalState
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles.testRegionCrossingHFileSplitRowBloom(org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles)
  Run 1: TestSecureLoadIncrementalHFiles.testRegionCrossingHFileSplitRowBloom » OutOfMemory
  Run 2: TestSecureLoadIncrementalHFiles.testRegionCrossingHFileSplitRowBloom » Remote ...
  Run 3: PASS

org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles.testRegionCrossingHFileSplitRowColBloom(org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles)
  Run 1: TestSecureLoadIncrementalHFiles.testRegionCrossingHFileSplitRowColBloom » Remote
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles.testRegionCrossingLoad(org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles)
  Run 1: TestSecureLoadIncrementalHFiles>TestLoadIncrementalHFiles.testRegionCrossingLoad:118->TestLoadIncrementalHFiles.runTest:209->TestLoadIncrementalHFiles.runTest:219->TestLoadIncrementalHFiles.runTest:232->TestLoadIncrementalHFiles.runTest:248 » IO
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles.testTableWithCFNameStartWithUnderScore(org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles)
  Run 1: TestSecureLoadIncrementalHFiles.testTableWithCFNameStartWithUnderScore » OutOfMemory
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testWithMapReduceAndOfflineHBaseMultiRegion(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceAndOfflineHBaseMultiRegion:111->TableSnapshotInputFormatTestBase.testWithMapReduce:160->TableSnapshotInputFormatTestBase.setupCluster:63 » IO
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat.testWithMapReduceMultiRegion(org.apache.hadoop.hbase.mapreduce.TestTableSnapshotInputFormat)
  Run 1: TestTableSnapshotInputFormat>TableSnapshotInputFormatTestBase.testWithMapReduceMultiRegion:105->TableSnapshotInputFormatTestBase.testWithMapReduce:161 » YarnRuntime
  Run 2: PASS

org.apache.hadoop.hbase.master.TestAssignmentListener.testAssignmentListener(org.apache.hadoop.hbase.master.TestAssignmentListener)
  Run 1: TestAssignmentListener.testAssignmentListener:248 » Runtime java.lang.OutOfMem...
  Run 2: PASS

org.apache.hadoop.hbase.migration.TestNamespaceUpgrade.testSnapshots(org.apache.hadoop.hbase.migration.TestNamespaceUpgrade)
  Run 1: TestNamespaceUpgrade.testSnapshots:201 » RestoreSnapshot org.apache.hadoop.hba...
  Run 2: PASS

org.apache.hadoop.hbase.regionserver.TestSplitWalDataLoss.test(org.apache.hadoop.hbase.regionserver.TestSplitWalDataLoss)
  Run 1: TestSplitWalDataLoss.test:121 » RetriesExhaustedWithDetails Failed 1 action: T...
  Run 2: PASS

org.apache.hadoop.hbase.util.TestHBaseFsck.testFixAssignmentsAndNoHdfsChecking(org.apache.hadoop.hbase.util.TestHBaseFsck)
  Run 1: TestHBaseFsck.testFixAssignmentsAndNoHdfsChecking:1886 expected:<[NOT_DEPLOYED, HOLE_IN_REGION_CHAIN]> but was:<[NOT_DEPLOYED, NOT_DEPLOYED, HOLE_IN_REGION_CHAIN, HOLE_IN_REGION_CHAIN]>
  Run 2: PASS

org.apache.hadoop.hbase.util.TestHBaseFsck.testFixAssignmentsWhenMETAinTransition(org.apache.hadoop.hbase.util.TestHBaseFsck)
  Run 1: TestHBaseFsck.testFixAssignmentsWhenMETAinTransition:286 expected:<[]> but was:<[NOT_DEPLOYED, HOLE_IN_REGION_CHAIN]>
  Run 2: PASS

org.apache.hadoop.hbase.util.TestHBaseFsck.testHbckAfterRegionMerge(org.apache.hadoop.hbase.util.TestHBaseFsck)
  Run 1: TestHBaseFsck.testHbckAfterRegionMerge:2646 » Runtime java.lang.OutOfMemoryErr...
  Run 2: PASS

org.apache.hadoop.hbase.util.TestHBaseFsck.testLingeringReferenceFile(org.apache.hadoop.hbase.util.TestHBaseFsck)
  Run 1: TestHBaseFsck.testLingeringReferenceFile:2254 expected:<[LINGERING_REFERENCE_HFILE]> but was:<[NOT_DEPLOYED, HOLE_IN_REGION_CHAIN, LINGERING_REFERENCE_HFILE]>
  Run 2: PASS

org.apache.hadoop.hbase.util.TestHBaseFsck.testMetaOffline(org.apache.hadoop.hbase.util.TestHBaseFsck)
  Run 1: TestHBaseFsck.testMetaOffline:2538 expected:<[]> but was:<[NOT_DEPLOYED, HOLE_IN_REGION_CHAIN]>
  Run 2: PASS

org.apache.hadoop.hbase.util.TestHBaseFsck.testNotInHdfsWithReplicas(org.apache.hadoop.hbase.util.TestHBaseFsck)
  Run 1: TestHBaseFsck.testNotInHdfsWithReplicas:1401->cleanupTable:474->deleteTable:2810 » TableNotFound
  Run 2: PASS

org.apache.hadoop.hbase.util.TestHBaseFsck.testRegionHole(org.apache.hadoop.hbase.util.TestHBaseFsck)
  Run 1: TestHBaseFsck.testRegionHole:1188->cleanupTable:474->deleteTable:2810 » TableNotFound
  Run 2: PASS


Tests run: 2170, Failures: 2, Errors: 46, Skipped: 40, Flakes: 38

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache HBase ...................................... SUCCESS [1:43.256s]
[INFO] Apache HBase - Checkstyle ......................... SUCCESS [8.704s]
[INFO] Apache HBase - Resource Bundle .................... SUCCESS [0.144s]
[INFO] Apache HBase - Annotations ........................ SUCCESS [0.915s]
[INFO] Apache HBase - Protocol ........................... SUCCESS [18.084s]
[INFO] Apache HBase - Common ............................. SUCCESS [2:53.041s]
[INFO] Apache HBase - Procedure .......................... SUCCESS [54.805s]
[INFO] Apache HBase - Client ............................. SUCCESS [54.499s]
[INFO] Apache HBase - Hadoop Compatibility ............... SUCCESS [8.484s]
[INFO] Apache HBase - Hadoop Two Compatibility ........... SUCCESS [10.628s]
[INFO] Apache HBase - Prefix Tree ........................ SUCCESS [9.120s]
[INFO] Apache HBase - Server ............................. FAILURE [1:14:24.150s]
[INFO] Apache HBase - Testing Util ....................... SKIPPED
[INFO] Apache HBase - Thrift ............................. SKIPPED
[INFO] Apache HBase - Rest ............................... SKIPPED
[INFO] Apache HBase - Shell .............................. SKIPPED
[INFO] Apache HBase - Integration Tests .................. SKIPPED
[INFO] Apache HBase - Examples ........................... SKIPPED
[INFO] Apache HBase - External Block Cache ............... SKIPPED
[INFO] Apache HBase - Assembly ........................... SKIPPED
[INFO] Apache HBase - Shaded ............................. SKIPPED
[INFO] Apache HBase - Shaded - Client .................... SKIPPED
[INFO] Apache HBase - Shaded - Server .................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:22:16.238s
[INFO] Finished at: Mon Nov 02 10:09:12 UTC 2015
[INFO] Final Memory: 429M/787M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.18.1:test (secondPartTestsExecution) on project hbase-server: ExecutionException: java.lang.RuntimeException: The forked VM terminated without properly saying goodbye. VM crash or System.exit called?
[ERROR] Command was /bin/sh -c cd <https://builds.apache.org/job/HBase-1.2/jdk=latest1.8,label=Hadoop/ws/hbase-server> && /home/jenkins/jenkins-slave/tools/hudson.model.JDK/latest1.8/jre/bin/java -enableassertions -XX:MaxDirectMemorySize=1G -Xmx2800m -XX:MaxPermSize=256m -Djava.security.egd=file:/dev/./urandom -Djava.net.preferIPv4Stack=true -Djava.awt.headless=true -jar <https://builds.apache.org/job/HBase-1.2/jdk=latest1.8,label=Hadoop/ws/hbase-server/target/surefire/surefirebooter7013074389245625573.jar> <https://builds.apache.org/job/HBase-1.2/jdk=latest1.8,label=Hadoop/ws/hbase-server/target/surefire/surefire6239168594083639353tmp> <https://builds.apache.org/job/HBase-1.2/jdk=latest1.8,label=Hadoop/ws/hbase-server/target/surefire/surefire_7643545192243342603585tmp>
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hbase-server
Build step 'Invoke top-level Maven targets' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script  :     ZOMBIE_TESTS_COUNT=`jps -v | grep surefirebooter | grep '-Dhbase.test' | wc -l`
  if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
    #It seems sometimes the tests are not dying immediately. Let's give them 30s
    echo "Suspicious java process found - waiting 30s to see if there are just slow to stop"
    sleep 30
    ZOMBIE_TESTS_COUNT=`jps -v | grep surefirebooter | grep '-Dhbase.test' | wc -l`
    if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
      echo "There are $ZOMBIE_TESTS_COUNT zombie tests, they should have been killed by surefire but survived"
      echo "************ zombies jps listing"
      jps -v
      jps -v | grep surefirebooter | grep '-Dhbase.test'
      echo "************ BEGIN zombies jstack extract"
      # HBase tests have been flagged with an innocuous '-Dhbase.test' just so they can
      # be identified as hbase in a process listing.
      ZB_STACK=`jps -v | grep surefirebooter | grep '-Dhbase.test' | cut -d ' ' -f 1 | xargs -n 1 jstack | grep ".test" | grep "\.java"`
      jps -v | grep surefirebooter | grep '-Dhbase.test' | cut -d ' ' -f 1 | xargs -n 1 jstack
      echo "************ END  zombies jstack extract"
      JIRA_COMMENT="$JIRA_COMMENT

     {color:red}-1 core zombie tests{color}.  There are ${ZOMBIE_TESTS_COUNT} zombie test(s): ${ZB_STACK}"
      BAD=1
      # Killing these zombies
      echo 'Killing ZOMBIES!!!'
      jps -v
      jps -v | grep surefirebooter | grep '-Dhbase.test' | cut -d ' ' -f 1 | xargs kill -9
    else
      echo "We're ok: there is no zombie test, but some tests took some time to stop"
    fi
[Hadoop] $ /bin/bash -xe /tmp/hudson7143417065191595431.sh
++ jps -v
++ grep -Dhbase.test
grep: unknown devices method
++ grep surefirebooter
++ wc -l
grep: write error: Broken pipe
+ ZOMBIE_TESTS_COUNT=0
/tmp/hudson7143417065191595431.sh: line 30: syntax error: unexpected end of file
POST BUILD TASK : FAILURE
END OF POST BUILD TASK : 0
Archiving artifacts
Recording test results
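
[Editor's note] The 'grep: unknown devices method' failure in the post-build trace above happens because the pattern '-Dhbase.test' begins with a dash: once the shell strips the quotes, GNU grep parses it as its -D/--devices option instead of a search pattern, so the zombie count silently comes out as 0. A minimal sketch of the problem and the usual fix (the file path below is illustrative only, not from the Jenkins job):

```shell
# Create a sample line resembling the jps output the script filters.
printf 'java -Dhbase.test surefirebooter\n' > /tmp/zombie_demo.txt

# Fails: grep consumes -Dhbase.test as its -D/--devices option
# ("grep: unknown devices method" on GNU grep), exit status 2.
grep '-Dhbase.test' /tmp/zombie_demo.txt 2>/dev/null \
  || echo "bare pattern rejected as an option"

# Works: -e marks the next argument as a pattern even if it starts
# with a dash ("grep -- '-Dhbase.test'" achieves the same).
grep -e '-Dhbase.test' /tmp/zombie_demo.txt
```

The subsequent "syntax error: unexpected end of file" at line 30 is a separate issue: the generated post-build script was cut off before the outer 'if' opened at 'if [[ $ZOMBIE_TESTS_COUNT != 0 ]]' was closed with a matching 'fi'.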

Jenkins build is back to normal : HBase-1.2 » latest1.8,Hadoop #338

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/HBase-1.2/jdk=latest1.8,label=Hadoop/338/changes>