Posted to commits@phoenix.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2017/09/06 07:48:24 UTC

Build failed in Jenkins: Phoenix-4.x-HBase-1.1 #563

See <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/563/display/redirect?page=changes>

Changes:

[ssa] PHOENIX-3406 CSV BulkLoad MR job incorrectly handle ROW_TIMESTAMP

[ssa] PHOENIX-4068 Atomic Upsert salted table with

[samarth] PHOENIX-4156 Fix flapping MutableIndexFailureIT

[samarth] PHOENIX-4141 Fix flapping TableSnapshotReadsMapReduceIT

[jtaylor] PHOENIX-3953 Clear INDEX_DISABLED_TIMESTAMP and disable index on

[samarth] PHOENIX-4151 Tests extending BaseQueryIT are flapping

[samarth] PHOENIX-4151 Addendum to fix test failure

------------------------------------------
[...truncated 103.31 KB...]
[INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 54.759 s - in org.apache.phoenix.end2end.index.IndexMetadataIT
[INFO] Running org.apache.phoenix.end2end.index.MutableIndexSplitForwardScanIT
[INFO] Running org.apache.phoenix.end2end.index.MutableIndexIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 154.364 s - in org.apache.phoenix.end2end.index.MutableIndexSplitForwardScanIT
[INFO] Running org.apache.phoenix.end2end.index.MutableIndexSplitReverseScanIT
[INFO] Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 324.54 s - in org.apache.phoenix.end2end.index.DropColumnIT
[INFO] Running org.apache.phoenix.end2end.index.SaltedIndexIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.422 s - in org.apache.phoenix.end2end.index.SaltedIndexIT
[INFO] Running org.apache.phoenix.end2end.index.ViewIndexIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 52.319 s - in org.apache.phoenix.end2end.index.ViewIndexIT
[INFO] Running org.apache.phoenix.end2end.index.txn.MutableRollbackIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 158.229 s - in org.apache.phoenix.end2end.index.MutableIndexSplitReverseScanIT
[INFO] Running org.apache.phoenix.end2end.index.txn.RollbackIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 74.35 s - in org.apache.phoenix.end2end.index.txn.MutableRollbackIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
[INFO] Tests run: 67, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 461.423 s - in org.apache.phoenix.end2end.index.IndexExpressionIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 54.661 s - in org.apache.phoenix.end2end.index.txn.RollbackIT
[INFO] Running org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
[INFO] Tests run: 102, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1,125.811 s - in org.apache.phoenix.end2end.SortMergeJoinIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.185 s - in org.apache.phoenix.iterate.PhoenixQueryTimeoutIT
[INFO] Running org.apache.phoenix.replication.SystemCatalogWALEntryFilterIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.051 s - in org.apache.phoenix.replication.SystemCatalogWALEntryFilterIT
[INFO] Running org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.257 s - in org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
[INFO] Running org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.294 s - in org.apache.phoenix.end2end.salted.SaltedTableVarLengthRowKeyIT
[INFO] Running org.apache.phoenix.trace.PhoenixTracingEndToEndIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.306 s - in org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
[INFO] Running org.apache.phoenix.tx.FlappingTransactionIT
[INFO] Running org.apache.phoenix.iterate.RoundRobinResultIteratorIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.686 s - in org.apache.phoenix.rpc.UpdateCacheIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.365 s - in org.apache.phoenix.tx.FlappingTransactionIT
[INFO] Running org.apache.phoenix.tx.ParameterizedTransactionIT
[INFO] Running org.apache.phoenix.tx.TransactionIT
[INFO] Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 49.302 s - in org.apache.phoenix.iterate.RoundRobinResultIteratorIT
[INFO] Running org.apache.phoenix.tx.TxCheckpointIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 88.527 s - in org.apache.phoenix.trace.PhoenixTracingEndToEndIT
[INFO] Running org.apache.phoenix.util.IndexScrutinyIT
[INFO] Tests run: 64, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 497.867 s - in org.apache.phoenix.end2end.index.MutableIndexIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 55.017 s - in org.apache.phoenix.tx.TransactionIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.306 s - in org.apache.phoenix.util.IndexScrutinyIT
[WARNING] Tests run: 52, Failures: 0, Errors: 0, Skipped: 4, Time elapsed: 228.441 s - in org.apache.phoenix.tx.ParameterizedTransactionIT
[INFO] Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 290.962 s - in org.apache.phoenix.tx.TxCheckpointIT
[INFO] Tests run: 304, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2,068.385 s - in org.apache.phoenix.end2end.index.IndexIT
[INFO] 
[INFO] Results:
[INFO] 
[ERROR] Errors: 
[ERROR]   MutableQueryIT.<init>:66->BaseQueryIT.<init>:120->BaseTest.initATableValues:1060->BaseTest.ensureTableCreated:712->BaseTest.createTestTable:756->BaseTest.createTestTable:792 » PhoenixIO
[INFO] 
[ERROR] Tests run: 3052, Failures: 0, Errors: 1, Skipped: 5
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:integration-test (ClientManagedTimeTests) @ phoenix-core ---
[INFO] 
[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.apache.phoenix.end2end.DistinctCountIT
[INFO] Running org.apache.phoenix.end2end.DerivedTableIT
[INFO] Running org.apache.phoenix.end2end.CustomEntityDataIT
[INFO] Running org.apache.phoenix.end2end.ExtendedQueryExecIT
[INFO] Running org.apache.phoenix.end2end.CreateSchemaIT
[INFO] Running org.apache.phoenix.end2end.DropSchemaIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.539 s - in org.apache.phoenix.end2end.CreateSchemaIT
[INFO] Running org.apache.phoenix.end2end.FunkyNamesIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.84 s - in org.apache.phoenix.end2end.FunkyNamesIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.277 s - in org.apache.phoenix.end2end.CustomEntityDataIT
[INFO] Running org.apache.phoenix.end2end.ProductMetricsIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.581 s - in org.apache.phoenix.end2end.ExtendedQueryExecIT
[INFO] Running org.apache.phoenix.end2end.QueryDatabaseMetaDataIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.413 s - in org.apache.phoenix.end2end.DropSchemaIT
[INFO] Running org.apache.phoenix.end2end.ReadIsolationLevelIT
[INFO] Running org.apache.phoenix.end2end.NativeHBaseTypesIT
[INFO] Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.391 s - in org.apache.phoenix.end2end.DerivedTableIT
[INFO] Running org.apache.phoenix.end2end.RowValueConstructorIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.317 s - in org.apache.phoenix.end2end.ReadIsolationLevelIT
[INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.841 s - in org.apache.phoenix.end2end.DistinctCountIT
[INFO] Running org.apache.phoenix.end2end.SequenceBulkAllocationIT
[INFO] Running org.apache.phoenix.end2end.SequenceIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.391 s - in org.apache.phoenix.end2end.NativeHBaseTypesIT
[INFO] Running org.apache.phoenix.end2end.ToNumberFunctionIT
[INFO] Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.665 s - in org.apache.phoenix.end2end.ToNumberFunctionIT
[INFO] Running org.apache.phoenix.end2end.TopNIT
[INFO] Tests run: 61, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.877 s - in org.apache.phoenix.end2end.ProductMetricsIT
[INFO] Running org.apache.phoenix.end2end.TruncateFunctionIT
[INFO] Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.514 s - in org.apache.phoenix.end2end.TopNIT
[INFO] Running org.apache.phoenix.end2end.UpsertValuesIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.836 s - in org.apache.phoenix.end2end.TruncateFunctionIT
[INFO] Running org.apache.phoenix.end2end.VariableLengthPKIT
[INFO] Tests run: 56, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.147 s - in org.apache.phoenix.end2end.SequenceBulkAllocationIT
[INFO] Running org.apache.phoenix.end2end.salted.SaltedTableIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.727 s - in org.apache.phoenix.end2end.salted.SaltedTableIT
[INFO] Running org.apache.phoenix.rpc.UpdateCacheWithScnIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.221 s - in org.apache.phoenix.rpc.UpdateCacheWithScnIT
[INFO] Tests run: 55, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 52.515 s - in org.apache.phoenix.end2end.SequenceIT
[INFO] Tests run: 50, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 46.129 s - in org.apache.phoenix.end2end.VariableLengthPKIT
[INFO] Tests run: 46, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 121.726 s - in org.apache.phoenix.end2end.RowValueConstructorIT
[INFO] Tests run: 19, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 143.052 s - in org.apache.phoenix.end2end.QueryDatabaseMetaDataIT
[INFO] Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 125.232 s - in org.apache.phoenix.end2end.UpsertValuesIT
[INFO] 
[INFO] Results:
[INFO] 
[INFO] Tests run: 394, Failures: 0, Errors: 0, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:integration-test (HBaseManagedTimeTests) @ phoenix-core ---
[INFO] 
[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO] 
[INFO] Results:
[INFO] 
[INFO] Tests run: 0, Failures: 0, Errors: 0, Skipped: 0
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:integration-test (NeedTheirOwnClusterTests) @ phoenix-core ---
[INFO] 
[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO] Running org.apache.hadoop.hbase.regionserver.wal.WALReplayWithIndexWritesAndCompressedWALIT
[INFO] Running org.apache.phoenix.end2end.ConnectionUtilIT
[INFO] Running org.apache.hadoop.hbase.regionserver.wal.WALRecoveryRegionPostOpenIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.438 s - in org.apache.hadoop.hbase.regionserver.wal.WALReplayWithIndexWritesAndCompressedWALIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.397 s - in org.apache.hadoop.hbase.regionserver.wal.WALRecoveryRegionPostOpenIT
[INFO] Running org.apache.phoenix.end2end.ArrayIT
[INFO] Running org.apache.phoenix.end2end.ContextClassloaderIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.823 s - in org.apache.phoenix.end2end.ConnectionUtilIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.478 s - in org.apache.phoenix.end2end.ContextClassloaderIT
[INFO] Running org.apache.phoenix.end2end.CountDistinctCompressionIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.535 s - in org.apache.phoenix.end2end.CountDistinctCompressionIT
[INFO] Running org.apache.phoenix.end2end.CsvBulkLoadToolIT
[INFO] Running org.apache.phoenix.end2end.FlappingLocalIndexIT
[INFO] Running org.apache.phoenix.end2end.IndexToolForPartialBuildIT
[INFO] Running org.apache.phoenix.end2end.IndexExtendedIT
[INFO] Running org.apache.phoenix.end2end.IndexToolForPartialBuildWithNamespaceEnabledIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.035 s - in org.apache.phoenix.end2end.IndexToolForPartialBuildIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.029 s - in org.apache.phoenix.end2end.IndexToolForPartialBuildWithNamespaceEnabledIT
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 84.518 s - in org.apache.phoenix.end2end.CsvBulkLoadToolIT
[INFO] Running org.apache.phoenix.end2end.QueryWithLimitIT
[INFO] Running org.apache.phoenix.end2end.QueryTimeoutIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.224 s - in org.apache.phoenix.end2end.QueryWithLimitIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.172 s - in org.apache.phoenix.end2end.QueryTimeoutIT
[INFO] Running org.apache.phoenix.end2end.RebuildIndexConnectionPropsIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.478 s - in org.apache.phoenix.end2end.RebuildIndexConnectionPropsIT
[INFO] Running org.apache.phoenix.end2end.RegexBulkLoadToolIT
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 136.464 s - in org.apache.phoenix.end2end.FlappingLocalIndexIT
[INFO] Running org.apache.phoenix.end2end.RenewLeaseIT
[INFO] Tests run: 80, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 191.155 s - in org.apache.phoenix.end2end.ArrayIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.42 s - in org.apache.phoenix.end2end.RenewLeaseIT
[INFO] Running org.apache.phoenix.end2end.SpillableGroupByIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.459 s - in org.apache.phoenix.end2end.SpillableGroupByIT
[INFO] Running org.apache.phoenix.end2end.StatsCollectorIT
[INFO] Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 73.409 s - in org.apache.phoenix.end2end.RegexBulkLoadToolIT
[INFO] Running org.apache.phoenix.end2end.SysTableNamespaceMappedStatsCollectorIT
[INFO] Running org.apache.phoenix.end2end.TableSnapshotReadsMapReduceIT
[INFO] Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.637 s - in org.apache.phoenix.end2end.TableSnapshotReadsMapReduceIT
[INFO] Running org.apache.phoenix.end2end.UpdateCacheAcrossDifferentClientsIT
[INFO] Running org.apache.phoenix.end2end.UserDefinedFunctionsIT
[INFO] Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.73 s - in org.apache.phoenix.end2end.UpdateCacheAcrossDifferentClientsIT
[INFO] Running org.apache.phoenix.end2end.index.ImmutableIndexIT
[INFO] Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 57.302 s - in org.apache.phoenix.end2end.UserDefinedFunctionsIT
[INFO] Running org.apache.phoenix.end2end.index.LocalIndexIT
[INFO] Running org.apache.phoenix.end2end.index.MutableIndexFailureIT
[WARNING] Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.002 s - in org.apache.phoenix.end2end.index.MutableIndexFailureIT
[INFO] Running org.apache.phoenix.end2end.index.MutableIndexReplicationIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.566 s - in org.apache.phoenix.end2end.index.MutableIndexReplicationIT
[WARNING] Tests run: 24, Failures: 0, Errors: 0, Skipped: 16, Time elapsed: 116.516 s - in org.apache.phoenix.end2end.index.ImmutableIndexIT
[INFO] Running org.apache.phoenix.end2end.index.PartialIndexRebuilderIT
[INFO] Running org.apache.phoenix.end2end.index.txn.TxWriteFailureIT
[INFO] Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 55.328 s - in org.apache.phoenix.end2end.index.txn.TxWriteFailureIT
[INFO] Tests run: 140, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 501.094 s - in org.apache.phoenix.end2end.IndexExtendedIT
[INFO] Running org.apache.phoenix.execute.UpsertSelectOverlappingBatchesIT
[INFO] Running org.apache.phoenix.execute.PartialCommitIT
[INFO] Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.824 s - in org.apache.phoenix.execute.PartialCommitIT
[WARNING] Tests run: 132, Failures: 0, Errors: 0, Skipped: 24, Time elapsed: 383.481 s - in org.apache.phoenix.end2end.StatsCollectorIT
[WARNING] Tests run: 132, Failures: 0, Errors: 0, Skipped: 24, Time elapsed: 371.598 s - in org.apache.phoenix.end2end.SysTableNamespaceMappedStatsCollectorIT
[INFO] Running org.apache.phoenix.hbase.index.FailForUnsupportedHBaseVersionsIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.692 s - in org.apache.phoenix.execute.UpsertSelectOverlappingBatchesIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.448 s - in org.apache.phoenix.hbase.index.FailForUnsupportedHBaseVersionsIT
[INFO] Running org.apache.phoenix.hbase.index.covered.EndToEndCoveredColumnsIndexBuilderIT
[INFO] Running org.apache.phoenix.hbase.index.covered.FailWithoutRetriesIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.371 s - in org.apache.phoenix.hbase.index.covered.FailWithoutRetriesIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.504 s - in org.apache.phoenix.hbase.index.covered.EndToEndCoveredColumnsIndexBuilderIT
[INFO] Running org.apache.phoenix.iterate.RoundRobinResultIteratorWithStatsIT
[INFO] Running org.apache.phoenix.iterate.ScannerLeaseRenewalIT
[INFO] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.233 s - in org.apache.phoenix.iterate.RoundRobinResultIteratorWithStatsIT
[INFO] Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 333.472 s - in org.apache.phoenix.end2end.index.LocalIndexIT
[INFO] Running org.apache.phoenix.rpc.PhoenixClientRpcIT
[INFO] Running org.apache.phoenix.monitoring.PhoenixMetricsIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.117 s - in org.apache.phoenix.rpc.PhoenixClientRpcIT
[INFO] Running org.apache.phoenix.rpc.PhoenixServerRpcIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.13 s - in org.apache.phoenix.rpc.PhoenixServerRpcIT
[INFO] Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 62.71 s - in org.apache.phoenix.monitoring.PhoenixMetricsIT
[INFO] Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 162.971 s - in org.apache.phoenix.iterate.ScannerLeaseRenewalIT
[INFO] Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 546.787 s - in org.apache.phoenix.end2end.index.PartialIndexRebuilderIT
[INFO] 
[INFO] Results:
[INFO] 
[WARNING] Tests run: 701, Failures: 0, Errors: 0, Skipped: 65
[INFO] 
[INFO] 
[INFO] --- maven-failsafe-plugin:2.20:verify (ParallelStatsEnabledTest) @ phoenix-core ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Phoenix ..................................... SUCCESS [  2.127 s]
[INFO] Phoenix Core ....................................... FAILURE [  01:46 h]
[INFO] Phoenix - Flume .................................... SKIPPED
[INFO] Phoenix - Kafka .................................... SKIPPED
[INFO] Phoenix - Pig ...................................... SKIPPED
[INFO] Phoenix Query Server Client ........................ SKIPPED
[INFO] Phoenix Query Server ............................... SKIPPED
[INFO] Phoenix - Pherf .................................... SKIPPED
[INFO] Phoenix - Spark .................................... SKIPPED
[INFO] Phoenix - Hive ..................................... SKIPPED
[INFO] Phoenix Client ..................................... SKIPPED
[INFO] Phoenix Server ..................................... SKIPPED
[INFO] Phoenix Assembly ................................... SKIPPED
[INFO] Phoenix - Tracing Web Application .................. SKIPPED
[INFO] Phoenix Load Balancer .............................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:46 h
[INFO] Finished at: 2017-09-06T07:38:50Z
[INFO] Final Memory: 62M/1084M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.20:verify (ParallelStatsEnabledTest) on project phoenix-core: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-core/target/failsafe-reports> for the individual test results.
[ERROR] Please refer to dump files (if any exist) [date]-jvmRun[N].dump, [date].dumpstream and [date]-jvmRun[N].dumpstream.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :phoenix-core
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
[Fast Archiver] Compressed 1.14 GB of artifacts by 30.5% relative to #550
Recording test results

Jenkins build is back to normal : Phoenix-4.x-HBase-1.1 #565

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/565/display/redirect?page=changes>


Build failed in Jenkins: Phoenix-4.x-HBase-1.1 #564

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/564/display/redirect?page=changes>

Changes:

[jtaylor] PHOENIX-4162 Disallow transition from DISABLE to INACTIVE when

------------------------------------------
[...truncated 307.90 KB...]
[INFO] --- maven-surefire-plugin:2.20:test (default-test) @ phoenix-spark ---
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ phoenix-spark ---
[INFO] Tests are skipped.
[INFO] 
[INFO] --- maven-source-plugin:2.2.1:jar-no-fork (attach-sources) @ phoenix-spark ---
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/564/artifact/phoenix-spark/target/phoenix-spark-4.12.0-HBase-1.1-SNAPSHOT-sources.jar>
[INFO] 
[INFO] --- maven-jar-plugin:2.4:test-jar (default) @ phoenix-spark ---
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/564/artifact/phoenix-spark/target/phoenix-spark-4.12.0-HBase-1.1-SNAPSHOT-tests.jar>
[INFO] 
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ phoenix-spark ---
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/564/artifact/phoenix-spark/target/phoenix-spark-4.12.0-HBase-1.1-SNAPSHOT.jar>
[INFO] 
[INFO] --- maven-jar-plugin:2.4:jar (empty-javadoc-jar) @ phoenix-spark ---
[WARNING] JAR will be empty - no content was marked for inclusion!
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/564/artifact/phoenix-spark/target/phoenix-spark-4.12.0-HBase-1.1-SNAPSHOT-javadoc.jar>
[INFO] 
[INFO] --- maven-site-plugin:3.2:attach-descriptor (attach-descriptor) @ phoenix-spark ---
[INFO] 
[INFO] --- scalatest-maven-plugin:1.0:test (integration-test) @ phoenix-spark ---
WARNING: -c has been deprecated and will be reused for a different (but still very cool) purpose in ScalaTest 2.0. Please change all uses of -c to -P.
Discovery starting.
Discovery completed in 725 milliseconds.
Run starting. Expected test count is: 35
PhoenixSparkIT:
AbstractPhoenixSparkIT:
PhoenixSparkITTenantSpecific:
Formatting using clusterid: testClusterID
Formatting using clusterid: testClusterID
2    [ScalaTest-3] ERROR org.apache.hadoop.hdfs.MiniDFSCluster  - IOE creating namenodes. Permissions dump:
path '<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b/dfs/data'>: 
	absolute:<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b/dfs/data>
	permissions: ----
path '<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b/dfs'>: 
	absolute:<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b/dfs>
	permissions: drwx
path '<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b'>: 
	absolute:<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b>
	permissions: drwx
path '<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f'>: 
	absolute:<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f>
	permissions: drwx
path '<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data'>: 
	absolute:<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data>
	permissions: drwx
path '<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target'>: 
	absolute:<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target>
	permissions: drwx
path '<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark'>: 
	absolute:<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark>
	permissions: drwx
path '<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/'>: 
	absolute:<https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/>
	permissions: drwx
path '/home/jenkins/jenkins-slave/workspace': 
	absolute:/home/jenkins/jenkins-slave/workspace
	permissions: drwx
path '/home/jenkins/jenkins-slave': 
	absolute:/home/jenkins/jenkins-slave
	permissions: drwx
path '/home/jenkins': 
	absolute:/home/jenkins
	permissions: drwx
path '/home': 
	absolute:/home
	permissions: dr-x
path '/': 
	absolute:/
	permissions: dr-x

java.io.IOException: Cannot create directory <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b/dfs/name1/current>
	at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:337)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:548)
	at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:569)
	at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:161)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:991)
	at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:342)
	at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:176)
	at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:973)
	at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:811)
	at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:742)
	at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniDFSCluster(HBaseTestingUtility.java:574)
	at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:968)
	at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:849)
	at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:831)
	at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniCluster(HBaseTestingUtility.java:818)
	at org.apache.phoenix.query.BaseTest.initMiniCluster(BaseTest.java:558)
	at org.apache.phoenix.query.BaseTest.setUpTestCluster(BaseTest.java:458)
	at org.apache.phoenix.query.BaseTest.checkClusterInitialized(BaseTest.java:440)
	at org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:532)
	at org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:526)
	at org.apache.phoenix.end2end.BaseHBaseManagedTimeIT.doSetup(BaseHBaseManagedTimeIT.java:57)
	at org.apache.phoenix.spark.PhoenixSparkITHelper$.doSetup(AbstractPhoenixSparkIT.scala:33)
	at org.apache.phoenix.spark.AbstractPhoenixSparkIT.beforeAll(AbstractPhoenixSparkIT.scala:88)
	at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
	at org.apache.phoenix.spark.AbstractPhoenixSparkIT.beforeAll(AbstractPhoenixSparkIT.scala:44)
	at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
	at org.apache.phoenix.spark.AbstractPhoenixSparkIT.run(AbstractPhoenixSparkIT.scala:44)
	at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:55)
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:745)
Exception encountered when invoking run on a nested suite - java.io.IOException: Cannot create directory <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b/dfs/name1/current> *** ABORTED ***
  java.lang.RuntimeException: java.io.IOException: Cannot create directory <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b/dfs/name1/current>
  at org.apache.phoenix.query.BaseTest.initMiniCluster(BaseTest.java:561)
  at org.apache.phoenix.query.BaseTest.setUpTestCluster(BaseTest.java:458)
  at org.apache.phoenix.query.BaseTest.checkClusterInitialized(BaseTest.java:440)
  at org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:532)
  at org.apache.phoenix.query.BaseTest.setUpTestDriver(BaseTest.java:526)
  at org.apache.phoenix.end2end.BaseHBaseManagedTimeIT.doSetup(BaseHBaseManagedTimeIT.java:57)
  at org.apache.phoenix.spark.PhoenixSparkITHelper$.doSetup(AbstractPhoenixSparkIT.scala:33)
  at org.apache.phoenix.spark.AbstractPhoenixSparkIT.beforeAll(AbstractPhoenixSparkIT.scala:88)
  at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
  at org.apache.phoenix.spark.AbstractPhoenixSparkIT.beforeAll(AbstractPhoenixSparkIT.scala:44)
  ...
  Cause: java.io.IOException: Cannot create directory <https://builds.apache.org/job/Phoenix-4.x-HBase-1.1/ws/phoenix-spark/target/test-data/69683e42-6703-44be-9d85-ab9e4b673b5f/dfscluster_89a91e0b-5d41-4fb0-9ad0-0a6ed1de255b/dfs/name1/current>
  at org.apache.hadoop.hdfs.server.common.Storage$StorageDirectory.clearDirectory(Storage.java:337)
  at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:548)
  at org.apache.hadoop.hdfs.server.namenode.NNStorage.format(NNStorage.java:569)
  at org.apache.hadoop.hdfs.server.namenode.FSImage.format(FSImage.java:161)
  at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:991)
  at org.apache.hadoop.hdfs.server.namenode.NameNode.format(NameNode.java:342)
  at org.apache.hadoop.hdfs.DFSTestUtil.formatNameNode(DFSTestUtil.java:176)
  at org.apache.hadoop.hdfs.MiniDFSCluster.createNameNodesAndSetConf(MiniDFSCluster.java:973)
  at org.apache.hadoop.hdfs.MiniDFSCluster.initMiniDFSCluster(MiniDFSCluster.java:811)
  at org.apache.hadoop.hdfs.MiniDFSCluster.<init>(MiniDFSCluster.java:742)
  ...
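The aborted suite traces to a MiniDFSCluster format failure: the NameNode could not create its storage directory under the workspace's `phoenix-spark/target/test-data` path. A stale or permission-locked directory left by a previous run is a common cause of this IOException; the sketch below is an illustrative cleanup, not a confirmed fix, and the exact path varies per run (it embeds random UUIDs):

```shell
# Illustrative cleanup for leftover MiniDFSCluster test data.
# $WORKSPACE is set by Jenkins; fall back to the current directory locally.
WORKSPACE="${WORKSPACE:-$PWD}"
TEST_DATA="$WORKSPACE/phoenix-spark/target/test-data"

# Show whatever a previous run left behind, then remove it so the
# NameNode can format a fresh dfs/name directory on the next run.
ls -ld "$TEST_DATA" 2>/dev/null || echo "no leftover test-data"
rm -rf "$TEST_DATA"
```

If the directory cannot be removed, checking ownership and free disk space on the build node (`df -h`, `ls -l`) is the usual next step.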
10983 [RpcServer.reader=1,bindAddress=qnode3.quenda.co,port=41577] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.1.1 port: 42687 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
11960 [RpcServer.reader=1,bindAddress=qnode3.quenda.co,port=45045] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 37372 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
12601 [RpcServer.reader=2,bindAddress=qnode3.quenda.co,port=45045] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 37376 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
14098 [RpcServer.reader=3,bindAddress=qnode3.quenda.co,port=45045] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 37388 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
15001 [RpcServer.reader=4,bindAddress=qnode3.quenda.co,port=45045] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 37400 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
15045 [RpcServer.reader=2,bindAddress=qnode3.quenda.co,port=41577] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 35474 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
67190 [ScalaTest-4] INFO  org.spark_project.jetty.util.log  - Logging initialized @73926ms
67417 [ScalaTest-4] INFO  org.spark_project.jetty.server.Server  - jetty-9.2.z-SNAPSHOT
67460 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@65440139{/jobs,null,AVAILABLE}
67461 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@53ac791f{/jobs/json,null,AVAILABLE}
67462 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@60440d23{/jobs/job,null,AVAILABLE}
67463 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@13baa635{/jobs/job/json,null,AVAILABLE}
67464 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@42f5ebb1{/stages,null,AVAILABLE}
67465 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@1f44d3cf{/stages/json,null,AVAILABLE}
67466 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@5fadcb12{/stages/stage,null,AVAILABLE}
67467 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@aef3bd1{/stages/stage/json,null,AVAILABLE}
67468 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@60401bf2{/stages/pool,null,AVAILABLE}
67469 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@10eade3c{/stages/pool/json,null,AVAILABLE}
67469 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@2ba43dd1{/storage,null,AVAILABLE}
67470 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@2bd67ab9{/storage/json,null,AVAILABLE}
67471 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@e131e2b{/storage/rdd,null,AVAILABLE}
67472 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@921a045{/storage/rdd/json,null,AVAILABLE}
67472 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@7ea116b1{/environment,null,AVAILABLE}
67473 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@1d8ff6a{/environment/json,null,AVAILABLE}
67474 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@4d688918{/executors,null,AVAILABLE}
67474 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@a00ae5c{/executors/json,null,AVAILABLE}
67475 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@32b71f25{/executors/threadDump,null,AVAILABLE}
67476 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@145dd026{/executors/threadDump/json,null,AVAILABLE}
67498 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@1b0c793c{/static,null,AVAILABLE}
67505 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@4feb6003{/,null,AVAILABLE}
67508 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@65ec8df2{/api,null,AVAILABLE}
67510 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@d5b3f25{/stages/stage/kill,null,AVAILABLE}
67523 [ScalaTest-4] INFO  org.spark_project.jetty.server.ServerConnector  - Started ServerConnector@18fa2b54{HTTP/1.1}{0.0.0.0:4040}
67536 [ScalaTest-4] INFO  org.spark_project.jetty.server.Server  - Started @74274ms
68348 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@440e5527{/metrics/json,null,AVAILABLE}
70199 [RpcServer.reader=5,bindAddress=qnode3.quenda.co,port=45045] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 37596 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
70257 [RpcServer.reader=3,bindAddress=qnode3.quenda.co,port=41577] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 35670 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
75813 [ScalaTest-4-running-PhoenixSparkITTenantSpecific] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@797a25a0{/SQL,null,AVAILABLE}
75815 [ScalaTest-4-running-PhoenixSparkITTenantSpecific] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@c3e407b{/SQL/json,null,AVAILABLE}
75817 [ScalaTest-4-running-PhoenixSparkITTenantSpecific] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@41c57f19{/SQL/execution,null,AVAILABLE}
75818 [ScalaTest-4-running-PhoenixSparkITTenantSpecific] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@14514e6{/SQL/execution/json,null,AVAILABLE}
75822 [ScalaTest-4-running-PhoenixSparkITTenantSpecific] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Started o.s.j.s.ServletContextHandler@306e8128{/static/sql,null,AVAILABLE}
79140 [RpcServer.reader=6,bindAddress=qnode3.quenda.co,port=45045] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 37608 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
79182 [RpcServer.reader=4,bindAddress=qnode3.quenda.co,port=41577] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 35682 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
- Can read from tenant-specific table as DataFrame
80869 [RpcServer.reader=7,bindAddress=qnode3.quenda.co,port=45045] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 37616 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
80897 [RpcServer.reader=5,bindAddress=qnode3.quenda.co,port=41577] INFO  SecurityLogger.org.apache.hadoop.hbase.Server  - Connection from 127.0.0.1 port: 35690 with version info: version: "1.1.9" url: "git://diocles.local/Volumes/hbase-1.1.9/hbase" revision: "0d1feabed5295495ed2257d31fab9e6553e8a9d7" user: "ndimiduk" date: "Mon Feb 20 22:35:28 PST 2017" src_checksum: "b68339108ddccd1dfc44a76646588a58"
- Can read from tenant-specific table as RDD
- Can write a DataFrame using 'DataFrame.saveToPhoenix' to tenant-specific view
- Can write a DataFrame using 'DataFrame.write' to tenant-specific view
- Can write an RDD to Phoenix tenant-specific view
85249 [ScalaTest-4] INFO  org.spark_project.jetty.server.ServerConnector  - Stopped ServerConnector@18fa2b54{HTTP/1.1}{0.0.0.0:4040}
85253 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@d5b3f25{/stages/stage/kill,null,UNAVAILABLE}
85254 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@65ec8df2{/api,null,UNAVAILABLE}
85255 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@4feb6003{/,null,UNAVAILABLE}
85255 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@1b0c793c{/static,null,UNAVAILABLE}
85256 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@145dd026{/executors/threadDump/json,null,UNAVAILABLE}
85256 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@32b71f25{/executors/threadDump,null,UNAVAILABLE}
85257 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@a00ae5c{/executors/json,null,UNAVAILABLE}
85257 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@4d688918{/executors,null,UNAVAILABLE}
85258 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@1d8ff6a{/environment/json,null,UNAVAILABLE}
85259 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@7ea116b1{/environment,null,UNAVAILABLE}
85259 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@921a045{/storage/rdd/json,null,UNAVAILABLE}
85260 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@e131e2b{/storage/rdd,null,UNAVAILABLE}
85260 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@2bd67ab9{/storage/json,null,UNAVAILABLE}
85261 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@2ba43dd1{/storage,null,UNAVAILABLE}
85261 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@10eade3c{/stages/pool/json,null,UNAVAILABLE}
85261 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@60401bf2{/stages/pool,null,UNAVAILABLE}
85262 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@aef3bd1{/stages/stage/json,null,UNAVAILABLE}
85262 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@5fadcb12{/stages/stage,null,UNAVAILABLE}
85262 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@1f44d3cf{/stages/json,null,UNAVAILABLE}
85263 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@42f5ebb1{/stages,null,UNAVAILABLE}
85263 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@13baa635{/jobs/job/json,null,UNAVAILABLE}
85263 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@60440d23{/jobs/job,null,UNAVAILABLE}
85264 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@53ac791f{/jobs/json,null,UNAVAILABLE}
85264 [ScalaTest-4] INFO  org.spark_project.jetty.server.handler.ContextHandler  - Stopped o.s.j.s.ServletContextHandler@65440139{/jobs,null,UNAVAILABLE}
Run completed in 2 minutes, 50 seconds.
Total number of tests run: 5
Suites: completed 3, aborted 1
Tests: succeeded 5, failed 0, canceled 0, ignored 0, pending 0
*** 1 SUITE ABORTED ***
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Phoenix ..................................... SUCCESS [  4.307 s]
[INFO] Phoenix Core ....................................... SUCCESS [  01:45 h]
[INFO] Phoenix - Flume .................................... SUCCESS [01:54 min]
[INFO] Phoenix - Kafka .................................... SUCCESS [02:27 min]
[INFO] Phoenix - Pig ...................................... SUCCESS [05:49 min]
[INFO] Phoenix Query Server Client ........................ SUCCESS [ 29.307 s]
[INFO] Phoenix Query Server ............................... SUCCESS [02:35 min]
[INFO] Phoenix - Pherf .................................... SUCCESS [03:04 min]
[INFO] Phoenix - Spark .................................... FAILURE [04:06 min]
[INFO] Phoenix - Hive ..................................... SKIPPED
[INFO] Phoenix Client ..................................... SKIPPED
[INFO] Phoenix Server ..................................... SKIPPED
[INFO] Phoenix Assembly ................................... SKIPPED
[INFO] Phoenix - Tracing Web Application .................. SKIPPED
[INFO] Phoenix Load Balancer .............................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:05 h
[INFO] Finished at: 2017-09-06T09:57:47Z
[INFO] Final Memory: 121M/1274M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (integration-test) on project phoenix-spark: There are test failures -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :phoenix-spark
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Recording test results