Posted to builds@hbase.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2015/12/15 09:03:27 UTC

Build failed in Jenkins: HBase-1.1-JDK8 #1707

See <https://builds.apache.org/job/HBase-1.1-JDK8/1707/changes>

Changes:

[jmhsieh] HBASE-14929 There is a space missing from table "foo" is not currently

------------------------------------------
[...truncated 3864 lines...]
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hbase.util.ForeignExceptionUtil.toIOException(ForeignExceptionUtil.java:45)
	at org.apache.hadoop.hbase.client.HBaseAdmin$ProcedureFuture.convertResult(HBaseAdmin.java:4206)
	at org.apache.hadoop.hbase.client.HBaseAdmin$ProcedureFuture.waitProcedureResult(HBaseAdmin.java:4164)
	at org.apache.hadoop.hbase.client.HBaseAdmin$ProcedureFuture.get(HBaseAdmin.java:4098)
	at org.apache.hadoop.hbase.client.HBaseAdmin.createTable(HBaseAdmin.java:560)
	at org.apache.hadoop.hbase.client.HBaseAdmin.createTable(HBaseAdmin.java:490)
	at org.apache.hadoop.hbase.replication.regionserver.TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationPeerIsCreatedForModifyTable(TestRegionReplicaReplicationEndpoint.java:151)
Caused by: org.apache.hadoop.ipc.RemoteException: Failed after attempts=5, exceptions:
Tue Dec 15 07:50:50 UTC 2015, RpcRetryingCaller{globalStartTime=1450165850167, pause=100, retries=5}, org.apache.hadoop.hbase.ipc.FailedServerException: This server is in the failed servers list: hemera.apache.org/140.211.11.27:52224
Tue Dec 15 07:50:50 UTC 2015, RpcRetryingCaller{globalStartTime=1450165850167, pause=100, retries=5}, org.apache.hadoop.hbase.ipc.FailedServerException: This server is in the failed servers list: hemera.apache.org/140.211.11.27:52224
Tue Dec 15 07:50:50 UTC 2015, RpcRetryingCaller{globalStartTime=1450165850167, pause=100, retries=5}, org.apache.hadoop.hbase.ipc.FailedServerException: This server is in the failed servers list: hemera.apache.org/140.211.11.27:52224
Tue Dec 15 07:50:51 UTC 2015, RpcRetryingCaller{globalStartTime=1450165850167, pause=100, retries=5}, org.apache.hadoop.hbase.ipc.FailedServerException: This server is in the failed servers list: hemera.apache.org/140.211.11.27:52224
Tue Dec 15 07:50:53 UTC 2015, RpcRetryingCaller{globalStartTime=1450165850167, pause=100, retries=5}, org.apache.hadoop.hbase.ipc.FailedServerException: This server is in the failed servers list: hemera.apache.org/140.211.11.27:52224

	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:147)
	at org.apache.hadoop.hbase.client.ResultBoundedCompletionService$QueueingFuture.run(ResultBoundedCompletionService.java:65)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
	at java.lang.Thread.run(Thread.java:745)
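The RetriesExhausted errors above come from the HBase client retry loop: each RpcRetryingCaller entry shows pause=100 ms and retries=5 against a server the client had already placed on its failed-servers list, so the attempts fail fast with FailedServerException rather than re-opening the connection. As a rough sketch only (not taken from the test source), the standard client configuration keys behind those two numbers look like this:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

public class ClientRetrySketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    // Number of client retries before a RetriesExhaustedException is raised
    // (the trace above shows retries=5).
    conf.setInt("hbase.client.retries.number", 5);
    // Base pause between retries in milliseconds, backed off on each attempt
    // (the trace above shows pause=100).
    conf.setLong("hbase.client.pause", 100);
    try (Connection connection = ConnectionFactory.createConnection(conf)) {
      // Admin and Table instances obtained from this connection inherit the
      // retry policy configured above.
    }
  }
}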

Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 103.217 sec - in org.apache.hadoop.hbase.util.TestMiniClusterLoadEncoded
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Running org.apache.hadoop.hbase.util.TestDefaultEnvironmentEdge
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.464 sec - in org.apache.hadoop.hbase.util.TestDefaultEnvironmentEdge
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Running org.apache.hadoop.hbase.util.TestFSHDFSUtils
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.492 sec - in org.apache.hadoop.hbase.util.TestFSHDFSUtils
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 38.278 sec - in org.apache.hadoop.hbase.util.hbck.TestOfflineMetaRebuildOverlap
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Running org.apache.hadoop.hbase.TestMetaTableAccessorNoCluster
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.933 sec - in org.apache.hadoop.hbase.TestMetaTableAccessorNoCluster
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 42.74 sec - in org.apache.hadoop.hbase.util.hbck.TestOfflineMetaRebuildHole
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Running org.apache.hadoop.hbase.migration.TestUpgradeTo96
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.385 sec - in org.apache.hadoop.hbase.migration.TestUpgradeTo96
Running org.apache.hadoop.hbase.TestNamespace
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Running org.apache.hadoop.hbase.migration.TestNamespaceUpgrade
Running org.apache.hadoop.hbase.TestInfoServers
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.509 sec - in org.apache.hadoop.hbase.TestInfoServers
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Running org.apache.hadoop.hbase.TestMetaTableAccessor
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 44.257 sec - in org.apache.hadoop.hbase.migration.TestNamespaceUpgrade
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=256m; support was removed in 8.0
Tests run: 9, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 69.458 sec - in org.apache.hadoop.hbase.TestNamespace
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 39.634 sec - in org.apache.hadoop.hbase.TestMetaTableAccessor
Tests run: 51, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 674.944 sec - in org.apache.hadoop.hbase.util.TestHBaseFsck

Results :

Tests in error: 
org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFiles.testRegionCrossingHFileSplitRowBloom(org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFiles)
  Run 1: TestLoadIncrementalHFiles.testRegionCrossingHFileSplitRowBloom:182->testRegionCrossingHFileSplit:195->runTest:223->runTest:232->runTest:285 » TestTimedOut
  Run 2: TestLoadIncrementalHFiles.testRegionCrossingHFileSplitRowBloom:182->testRegionCrossingHFileSplit:195->runTest:223->runTest:232->runTest:285->Object.wait:-2 » TestTimedOut

org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFiles.testSimpleHFileSplit(org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFiles)
  Run 1: TestLoadIncrementalHFiles.testSimpleHFileSplit:155->runTest:223->runTest:232->runTest:285 » TestTimedOut
  Run 2: TestLoadIncrementalHFiles.testSimpleHFileSplit:155->runTest:223->runTest:232->runTest:285 » TestTimedOut

org.apache.hadoop.hbase.master.procedure.TestWALProcedureStoreOnHDFS.testWalRollOnLowReplication(org.apache.hadoop.hbase.master.procedure.TestWALProcedureStoreOnHDFS)
  Run 1: TestWALProcedureStoreOnHDFS.testWalRollOnLowReplication:189 » Runtime sync abo...
  Run 2: TestWALProcedureStoreOnHDFS.testWalRollOnLowReplication:189 » Runtime sync abo...
  Run 3: TestWALProcedureStoreOnHDFS.testWalRollOnLowReplication:189 » Runtime sync abo...

  TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationIgnoresDroppedTables:335->testRegionReplicaReplicationIgnoresDisabledTables:348 » TestTimedOut
  TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationPeerIsCreated:121 » RetriesExhausted
  TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationPeerIsCreatedForModifyTable:151 » RetriesExhausted
  TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationWith10Replicas:259->testRegionReplicaReplication:181 » TestTimedOut
  TestRegionReplicaReplicationEndpoint.testRegionReplicaReplicationWith2Replicas:249->testRegionReplicaReplication:178 » RetriesExhausted
  TestRegionReplicaReplicationEndpoint.testRegionReplicaWithoutMemstoreReplication:269 » RetriesExhausted
Flaked tests: 
org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFiles.testRegionCrossingHFileSplit(org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFiles)
  Run 1: TestLoadIncrementalHFiles.testRegionCrossingHFileSplit:173->testRegionCrossingHFileSplit:195->runTest:223->runTest:232->runTest:254 » TestTimedOut
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFiles.testRegionCrossingHFileSplitRowColBloom(org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFiles)
  Run 1: TestLoadIncrementalHFiles.testRegionCrossingHFileSplitRowColBloom:191->testRegionCrossingHFileSplit:195->runTest:223->runTest:232->runTest:254->Object.wait:-2 » TestTimedOut
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFiles.testRegionCrossingRowColBloom(org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFiles)
  Run 1: TestLoadIncrementalHFiles.testRegionCrossingRowColBloom:142->runTest:209->runTest:223->runTest:232->runTest:285 » TestTimedOut
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint.testRegionCrossingHFileSplitRowBloom(org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint)
  Run 1: TestLoadIncrementalHFilesUseSecurityEndPoint>TestLoadIncrementalHFiles.testRegionCrossingHFileSplitRowBloom:182->TestLoadIncrementalHFiles.testRegionCrossingHFileSplit:195->TestLoadIncrementalHFiles.runTest:223->TestLoadIncrementalHFiles.runTest:232->TestLoadIncrementalHFiles.runTest:285 » TestTimedOut
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint.testRegionCrossingHFileSplitRowColBloom(org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint)
  Run 1: TestLoadIncrementalHFilesUseSecurityEndPoint>TestLoadIncrementalHFiles.testRegionCrossingHFileSplitRowColBloom:191->TestLoadIncrementalHFiles.testRegionCrossingHFileSplit:195->TestLoadIncrementalHFiles.runTest:223->TestLoadIncrementalHFiles.runTest:232->TestLoadIncrementalHFiles.runTest:285 » TestTimedOut
  Run 2: TestLoadIncrementalHFilesUseSecurityEndPoint>TestLoadIncrementalHFiles.testRegionCrossingHFileSplitRowColBloom:191->TestLoadIncrementalHFiles.testRegionCrossingHFileSplit:195->TestLoadIncrementalHFiles.runTest:223->TestLoadIncrementalHFiles.runTest:232->TestLoadIncrementalHFiles.runTest:285 » TestTimedOut
  Run 3: PASS

org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint.testSimpleHFileSplit(org.apache.hadoop.hbase.mapreduce.TestLoadIncrementalHFilesUseSecurityEndPoint)
  Run 1: TestLoadIncrementalHFilesUseSecurityEndPoint>TestLoadIncrementalHFiles.testSimpleHFileSplit:155->TestLoadIncrementalHFiles.runTest:223->TestLoadIncrementalHFiles.runTest:232->TestLoadIncrementalHFiles.runTest:285->Object.wait:-2 » TestTimedOut
  Run 2: PASS

org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles.testRegionCrossingHFileSplitRowColBloom(org.apache.hadoop.hbase.mapreduce.TestSecureLoadIncrementalHFiles)
  Run 1: TestSecureLoadIncrementalHFiles>TestLoadIncrementalHFiles.testRegionCrossingHFileSplitRowColBloom:191->TestLoadIncrementalHFiles.testRegionCrossingHFileSplit:195->TestLoadIncrementalHFiles.runTest:223->TestLoadIncrementalHFiles.runTest:232->TestLoadIncrementalHFiles.runTest:254 » TestTimedOut
  Run 2: PASS


Tests run: 2459, Failures: 0, Errors: 9, Skipped: 21, Flakes: 7
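
Most of the errors and flakes above are reported by surefire as TestTimedOut, meaning a JUnit test exceeded its timeout rather than failing an assertion. A minimal, hypothetical JUnit 4 sketch of the kind of per-test timeout involved (illustrative only, not taken from the HBase test sources):

import static org.junit.Assert.assertTrue;

import org.junit.Test;

public class TimeoutSketch {
  // If the body runs longer than the timeout, JUnit aborts it with a
  // TestTimedOutException, which surefire summarizes as "TestTimedOut".
  @Test(timeout = 180000) // milliseconds
  public void slowClusterOperation() throws Exception {
    assertTrue(doWork());
  }

  private boolean doWork() throws InterruptedException {
    // Placeholder for real work; a hung RPC here would trip the timeout.
    Thread.sleep(10);
    return true;
  }
}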

[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache HBase ...................................... SUCCESS [4.126s]
[INFO] Apache HBase - Checkstyle ......................... SUCCESS [0.862s]
[INFO] Apache HBase - Resource Bundle .................... SUCCESS [0.128s]
[INFO] Apache HBase - Annotations ........................ SUCCESS [1.750s]
[INFO] Apache HBase - Protocol ........................... SUCCESS [10.012s]
[INFO] Apache HBase - Common ............................. SUCCESS [1:17.081s]
[INFO] Apache HBase - Procedure .......................... SUCCESS [2:31.256s]
[INFO] Apache HBase - Client ............................. SUCCESS [1:17.351s]
[INFO] Apache HBase - Hadoop Compatibility ............... SUCCESS [8.798s]
[INFO] Apache HBase - Hadoop Two Compatibility ........... SUCCESS [6.014s]
[INFO] Apache HBase - Prefix Tree ........................ SUCCESS [7.660s]
[INFO] Apache HBase - Server ............................. FAILURE [2:16:08.760s]
[INFO] Apache HBase - Testing Util ....................... SKIPPED
[INFO] Apache HBase - Thrift ............................. SKIPPED
[INFO] Apache HBase - Rest ............................... SKIPPED
[INFO] Apache HBase - Shell .............................. SKIPPED
[INFO] Apache HBase - Integration Tests .................. SKIPPED
[INFO] Apache HBase - Examples ........................... SKIPPED
[INFO] Apache HBase - Assembly ........................... SKIPPED
[INFO] Apache HBase - Shaded ............................. SKIPPED
[INFO] Apache HBase - Shaded - Client .................... SKIPPED
[INFO] Apache HBase - Shaded - Server .................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 2:21:55.108s
[INFO] Finished at: Tue Dec 15 08:00:16 UTC 2015
[INFO] Final Memory: 54M/637M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.18.1:test (secondPartTestsExecution) on project hbase-server: ExecutionException: java.lang.RuntimeException: java.lang.RuntimeException: org.apache.maven.surefire.report.ReporterException: When writing xml report stdout/stderr: /tmp/stderr4747874629245446913deferred (No such file or directory) -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hbase-server
Build step 'Invoke top-level Maven targets' marked build as failure
Performing Post build task...
Match found for :.* : True
Logical operation result is TRUE
Running script  : # Post-build task script. TODO: Check this in and have all builds reference check-in.
pwd && ls
# NOTE!!!! The below code has been copied and pasted from ./dev-tools/run-test.sh
# Do not change here without syncing there and vice-versa.
ZOMBIE_TESTS_COUNT=`jps -v | grep surefirebooter | grep -e '-Dhbase.test' | wc -l`
if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
 echo "Suspicious java process found - waiting 30s to see if they are just slow to stop"
 sleep 30
 ZOMBIE_TESTS_COUNT=`jps -v | grep surefirebooter | grep -e '-Dhbase.test' | wc -l`
 if [[ $ZOMBIE_TESTS_COUNT != 0 ]] ; then
   echo " {color:red}There appear to be $ZOMBIE_TESTS_COUNT zombie tests{color}, they should have been killed by surefire but survived"
   jps -v | grep surefirebooter | grep -e '-Dhbase.test'
   jps -v | grep surefirebooter | grep -e '-Dhbase.test' | cut -d ' ' -f 1 | xargs -n 1 jstack
   # Exit with error
   exit 1
 else
   echo "We're ok: there is no zombie test, but some tests took some time to stop"
 fi
else
  echo "We're ok: there is no zombie test"
fi
[HBase-1.1-JDK8] $ /bin/bash -xe /tmp/hudson5774909230372051952.sh
+ pwd
/x1/jenkins/jenkins-slave/workspace/HBase-1.1-JDK8
+ ls
CHANGES.txt
LICENSE.txt
NOTICE.txt
README.txt
bin
conf
dev-support
hbase-annotations
hbase-assembly
hbase-checkstyle
hbase-client
hbase-common
hbase-examples
hbase-hadoop-compat
hbase-hadoop2-compat
hbase-it
hbase-native-client
hbase-prefix-tree
hbase-procedure
hbase-protocol
hbase-resource-bundle
hbase-rest
hbase-server
hbase-shaded
hbase-shell
hbase-testing-util
hbase-thrift
pom.xml
src
target
++ jps -v
++ grep surefirebooter
++ grep -e -Dhbase.test
++ wc -l
+ ZOMBIE_TESTS_COUNT=0
+ [[ 0 != 0 ]]
+ echo 'We'\''re ok: there is no zombie test'
We're ok: there is no zombie test
POST BUILD TASK : SUCCESS
END OF POST BUILD TASK : 0
Archiving artifacts
Recording test results
Updating HBASE-14929

Jenkins build is back to normal : HBase-1.1-JDK8 #1708

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/HBase-1.1-JDK8/1708/changes>