Posted to commits@phoenix.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2014/09/27 07:32:25 UTC

Build failed in Jenkins: Phoenix | 4.0 | Hadoop2 #173

See <https://builds.apache.org/job/Phoenix-4.0-hadoop2/173/changes>

Changes:

[jtaylor] Increasing memory a little bit

------------------------------------------
[...truncated 574 lines...]
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.junit.Assert.assertEquals(Assert.java:542)
	at org.apache.phoenix.query.BaseTest.assertValuesEqualsResultSet(BaseTest.java:1328)
	at org.apache.phoenix.end2end.BaseQueryIT.assertValueEqualsResultSet(BaseQueryIT.java:126)
	at org.apache.phoenix.end2end.QueryIT.testFullyQualifiedRVCInList(QueryIT.java:492)

testFullyQualifiedRVCInList[](org.apache.phoenix.end2end.QueryIT)  Time elapsed: 0.4 sec  <<< FAILURE!
java.lang.AssertionError: expected:<0> but was:<2>
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.junit.Assert.assertEquals(Assert.java:542)
	at org.apache.phoenix.query.BaseTest.assertValuesEqualsResultSet(BaseTest.java:1328)
	at org.apache.phoenix.end2end.BaseQueryIT.assertValueEqualsResultSet(BaseQueryIT.java:126)
	at org.apache.phoenix.end2end.QueryIT.testFullyQualifiedRVCInList(QueryIT.java:492)

Running org.apache.phoenix.end2end.ReadIsolationLevelIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.04 sec - in org.apache.phoenix.end2end.ReadIsolationLevelIT
Running org.apache.phoenix.end2end.CreateTableIT
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.586 sec - in org.apache.phoenix.end2end.UpsertSelectIT
Running org.apache.phoenix.end2end.CompareDecimalToLongIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.191 sec - in org.apache.phoenix.end2end.CompareDecimalToLongIT
Running org.apache.phoenix.end2end.OrderByIT
Tests run: 61, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.518 sec - in org.apache.phoenix.end2end.ProductMetricsIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.831 sec - in org.apache.phoenix.end2end.OrderByIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.013 sec - in org.apache.phoenix.end2end.CreateTableIT
Tests run: 63, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.084 sec - in org.apache.phoenix.end2end.CaseStatementIT
Tests run: 77, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 53.636 sec - in org.apache.phoenix.end2end.NotQueryIT

Results :

Failed tests: 
  QueryIT.testFullyQualifiedRVCInList:492->BaseQueryIT.assertValueEqualsResultSet:126->BaseTest.assertValuesEqualsResultSet:1328 expected:<0> but was:<2>
  QueryIT.testFullyQualifiedRVCInList:492->BaseQueryIT.assertValueEqualsResultSet:126->BaseTest.assertValuesEqualsResultSet:1328 expected:<0> but was:<2>
  QueryIT.testFullyQualifiedRVCInList:492->BaseQueryIT.assertValueEqualsResultSet:126->BaseTest.assertValuesEqualsResultSet:1328 expected:<0> but was:<2>
  QueryIT.testFullyQualifiedRVCInList:492->BaseQueryIT.assertValueEqualsResultSet:126->BaseTest.assertValuesEqualsResultSet:1328 expected:<0> but was:<2>
  QueryIT.testFullyQualifiedRVCInList:492->BaseQueryIT.assertValueEqualsResultSet:126->BaseTest.assertValuesEqualsResultSet:1328 expected:<0> but was:<2>
  QueryIT.testFullyQualifiedRVCInList:492->BaseQueryIT.assertValueEqualsResultSet:126->BaseTest.assertValuesEqualsResultSet:1328 expected:<0> but was:<2>
  QueryIT.testFullyQualifiedRVCInList:492->BaseQueryIT.assertValueEqualsResultSet:126->BaseTest.assertValuesEqualsResultSet:1328 expected:<0> but was:<2>

Tests run: 1270, Failures: 7, Errors: 0, Skipped: 0
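
Note on the failures above: testFullyQualifiedRVCInList exercises a fully qualified row value constructor (RVC) inside an IN list, and assertValuesEqualsResultSet compares an expected value array against the returned JDBC ResultSet, here failing with expected:<0> but was:<2>. The following is only a rough sketch of that query shape, not the actual test code; the table name, column names, and JDBC URL are assumptions used for illustration.

    // Hypothetical sketch of the query shape exercised by testFullyQualifiedRVCInList:
    // a fully qualified row value constructor used inside an IN list.
    // Table/column names and the JDBC URL are assumptions, not taken from the test source.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class RvcInListSketch {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost");
                 Statement stmt = conn.createStatement()) {
                // RVC IN list over the leading primary key columns
                ResultSet rs = stmt.executeQuery(
                    "SELECT entity_id FROM ATABLE " +
                    "WHERE (organization_id, entity_id) IN (('o1', 'e1'), ('o1', 'e2'))");
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }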

[INFO] 
[INFO] --- maven-failsafe-plugin:2.17:integration-test (HBaseManagedTimeTests) @ phoenix-core ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/Phoenix-4.0-hadoop2/ws/phoenix-core/target/failsafe-reports>
[INFO] parallel='none', perCoreThreadCount=true, threadCount=0, useUnlimitedThreads=false, threadCountSuites=0, threadCountClasses=0, threadCountMethods=0, parallelOptimized=true

-------------------------------------------------------
 T E S T S
-------------------------------------------------------

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
Running org.apache.phoenix.end2end.BinaryRowKeyIT
Running org.apache.phoenix.trace.PhoenixTraceReaderIT
Running org.apache.phoenix.trace.PhoenixTracingEndToEndIT
Running org.apache.phoenix.end2end.UpsertSelectAutoCommitIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.369 sec - in org.apache.phoenix.trace.PhoenixTableMetricsWriterIT
Running org.apache.phoenix.end2end.DynamicFamilyIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.179 sec - in org.apache.phoenix.trace.PhoenixTraceReaderIT
Running org.apache.phoenix.end2end.SortOrderFIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.013 sec - in org.apache.phoenix.end2end.BinaryRowKeyIT
Running org.apache.phoenix.end2end.ReverseScanIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.195 sec - in org.apache.phoenix.end2end.DynamicFamilyIT
Running org.apache.phoenix.end2end.MD5FunctionIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.342 sec - in org.apache.phoenix.end2end.ReverseScanIT
Running org.apache.phoenix.end2end.QueryPlanIT
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.057 sec - in org.apache.phoenix.end2end.SortOrderFIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.391 sec - in org.apache.phoenix.end2end.MD5FunctionIT
Running org.apache.phoenix.end2end.SkipScanQueryIT
Running org.apache.phoenix.end2end.RoundFloorCeilFunctionsEnd2EndIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.563 sec - in org.apache.phoenix.end2end.UpsertSelectAutoCommitIT
Running org.apache.phoenix.end2end.AlterTableIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.849 sec - in org.apache.phoenix.end2end.QueryPlanIT
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.706 sec - in org.apache.phoenix.end2end.RoundFloorCeilFunctionsEnd2EndIT
Running org.apache.phoenix.end2end.TimezoneOffsetFunctionIT
Running org.apache.phoenix.end2end.index.ViewIndexIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.941 sec - in org.apache.phoenix.end2end.TimezoneOffsetFunctionIT
Running org.apache.phoenix.end2end.index.LocalIndexIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.253 sec - in org.apache.phoenix.end2end.index.ViewIndexIT
Running org.apache.phoenix.end2end.index.IndexMetadataIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.114 sec - in org.apache.phoenix.end2end.SkipScanQueryIT
Running org.apache.phoenix.end2end.index.MutableIndexIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.407 sec - in org.apache.phoenix.end2end.index.IndexMetadataIT
Running org.apache.phoenix.end2end.index.DropViewIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.892 sec - in org.apache.phoenix.end2end.index.DropViewIT
Running org.apache.phoenix.end2end.index.SaltedIndexIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 29.906 sec - in org.apache.phoenix.end2end.AlterTableIT
Running org.apache.phoenix.end2end.index.ImmutableIndexIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.806 sec - in org.apache.phoenix.end2end.index.ImmutableIndexIT
Running org.apache.phoenix.end2end.CSVCommonsLoaderIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.351 sec - in org.apache.phoenix.end2end.index.SaltedIndexIT
Running org.apache.phoenix.end2end.ExecuteStatementsIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.749 sec - in org.apache.phoenix.end2end.ExecuteStatementsIT
Running org.apache.phoenix.end2end.ModulusExpressionIT
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.824 sec - in org.apache.phoenix.end2end.CSVCommonsLoaderIT
Running org.apache.phoenix.end2end.AutoCommitIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.704 sec - in org.apache.phoenix.end2end.AutoCommitIT
Running org.apache.phoenix.end2end.PhoenixEncodeDecodeIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.961 sec - in org.apache.phoenix.end2end.ModulusExpressionIT
Running org.apache.phoenix.end2end.ViewIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.224 sec - in org.apache.phoenix.end2end.PhoenixEncodeDecodeIT
Running org.apache.phoenix.end2end.QueryExecWithoutSCNIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.372 sec - in org.apache.phoenix.end2end.QueryExecWithoutSCNIT
Running org.apache.phoenix.end2end.LastValueFunctionIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.872 sec - in org.apache.phoenix.end2end.LastValueFunctionIT
Running org.apache.phoenix.end2end.DecodeFunctionIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.191 sec - in org.apache.phoenix.end2end.DecodeFunctionIT
Running org.apache.phoenix.end2end.RegexpSubstrFunctionIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.811 sec - in org.apache.phoenix.end2end.RegexpSubstrFunctionIT
Running org.apache.phoenix.end2end.TenantSpecificViewIndexIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.269 sec - in org.apache.phoenix.end2end.ViewIT
Running org.apache.phoenix.end2end.SkipScanAfterManualSplitIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 85.541 sec - in org.apache.phoenix.trace.PhoenixTracingEndToEndIT
Running org.apache.phoenix.end2end.ServerExceptionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.303 sec - in org.apache.phoenix.end2end.ServerExceptionIT
Running org.apache.phoenix.end2end.HashJoinIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.787 sec - in org.apache.phoenix.end2end.TenantSpecificViewIndexIT
Running org.apache.phoenix.end2end.SpillableGroupByIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 81.922 sec - in org.apache.phoenix.end2end.index.LocalIndexIT
Running org.apache.phoenix.end2end.InListIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.974 sec - in org.apache.phoenix.end2end.SpillableGroupByIT
Running org.apache.phoenix.end2end.UpsertBigValuesIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.56 sec - in org.apache.phoenix.end2end.UpsertBigValuesIT
Running org.apache.phoenix.end2end.CoalesceFunctionIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 23.425 sec - in org.apache.phoenix.end2end.SkipScanAfterManualSplitIT
Running org.apache.phoenix.end2end.ReadOnlyIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.664 sec - in org.apache.phoenix.end2end.ReadOnlyIT
Running org.apache.phoenix.end2end.SaltedViewIT
Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 95.492 sec - in org.apache.phoenix.end2end.index.MutableIndexIT
Running org.apache.phoenix.end2end.ReverseFunctionIT
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.479 sec - in org.apache.phoenix.end2end.CoalesceFunctionIT
Running org.apache.phoenix.end2end.EncodeFunctionIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.196 sec - in org.apache.phoenix.end2end.ReverseFunctionIT
Running org.apache.phoenix.end2end.NthValueFunctionIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.585 sec - in org.apache.phoenix.end2end.SaltedViewIT
Running org.apache.phoenix.end2end.GuidePostsLifeCycleIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.559 sec - in org.apache.phoenix.end2end.NthValueFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.86 sec - in org.apache.phoenix.end2end.GuidePostsLifeCycleIT
Running org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
Running org.apache.phoenix.end2end.ConvertTimezoneFunctionIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.441 sec - in org.apache.phoenix.end2end.EncodeFunctionIT
Running org.apache.phoenix.end2end.StatsCollectorIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.278 sec - in org.apache.phoenix.end2end.ConvertTimezoneFunctionIT
Running org.apache.phoenix.end2end.QueryMoreIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.614 sec - in org.apache.phoenix.end2end.StatsCollectorIT
Running org.apache.phoenix.end2end.TenantSpecificViewIndexSaltedIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.629 sec - in org.apache.phoenix.end2end.salted.SaltedTableUpsertSelectIT
Running org.apache.phoenix.end2end.ArithmeticQueryIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.334 sec - in org.apache.phoenix.end2end.TenantSpecificViewIndexSaltedIT
Running org.apache.phoenix.end2end.DeleteIT
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.4 sec - in org.apache.phoenix.end2end.ArithmeticQueryIT
Running org.apache.phoenix.end2end.LpadFunctionIT
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.791 sec - in org.apache.phoenix.end2end.LpadFunctionIT
Running org.apache.phoenix.end2end.StatementHintsIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.781 sec - in org.apache.phoenix.end2end.StatementHintsIT
Running org.apache.phoenix.end2end.FirstValueFunctionIT
Tests run: 19, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.777 sec - in org.apache.phoenix.end2end.DeleteIT
Running org.apache.phoenix.end2end.RegexpSplitFunctionIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.908 sec - in org.apache.phoenix.end2end.FirstValueFunctionIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.97 sec - in org.apache.phoenix.end2end.RegexpSplitFunctionIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 31.656 sec - in org.apache.phoenix.end2end.QueryMoreIT
Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 77.129 sec - in org.apache.phoenix.end2end.InListIT
Tests run: 102, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 97.594 sec - in org.apache.phoenix.end2end.HashJoinIT

Results :

Tests run: 497, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-failsafe-plugin:2.17:integration-test (NeedTheirOwnClusterTests) @ phoenix-core ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/Phoenix-4.0-hadoop2/ws/phoenix-core/target/failsafe-reports>
[INFO] parallel='none', perCoreThreadCount=true, threadCount=0, useUnlimitedThreads=false, threadCountSuites=0, threadCountClasses=0, threadCountMethods=0, parallelOptimized=true

-------------------------------------------------------
 T E S T S
-------------------------------------------------------

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.phoenix.hbase.index.FailForUnsupportedHBaseVersionsIT
Running org.apache.phoenix.hbase.index.covered.example.EndToEndCoveredIndexingIT
Running org.apache.phoenix.hbase.index.covered.EndToEndCoveredColumnsIndexBuilderIT
Running org.apache.phoenix.hbase.index.balancer.IndexLoadBalancerIT
Running org.apache.phoenix.hbase.index.covered.example.EndtoEndIndexingWithCompressionIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.985 sec - in org.apache.phoenix.hbase.index.FailForUnsupportedHBaseVersionsIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.924 sec - in org.apache.phoenix.hbase.index.covered.EndToEndCoveredColumnsIndexBuilderIT
Running org.apache.phoenix.hbase.index.covered.example.FailWithoutRetriesIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.664 sec - in org.apache.phoenix.hbase.index.covered.example.FailWithoutRetriesIT
Running org.apache.phoenix.mapreduce.CsvBulkLoadToolIT
Running org.apache.phoenix.end2end.ContextClassloaderIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.117 sec - in org.apache.phoenix.end2end.ContextClassloaderIT
Running org.apache.phoenix.end2end.index.MutableIndexFailureIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.876 sec - in org.apache.phoenix.hbase.index.covered.example.EndToEndCoveredIndexingIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.942 sec - in org.apache.phoenix.hbase.index.covered.example.EndtoEndIndexingWithCompressionIT
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 100.82 sec - in org.apache.phoenix.hbase.index.balancer.IndexLoadBalancerIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 143.951 sec - in org.apache.phoenix.end2end.index.MutableIndexFailureIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 173.072 sec - in org.apache.phoenix.mapreduce.CsvBulkLoadToolIT

Results :

Tests run: 46, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-failsafe-plugin:2.17:verify (ClientManagedTimeTests) @ phoenix-core ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/Phoenix-4.0-hadoop2/ws/phoenix-core/target/failsafe-reports>
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Phoenix .................................... SUCCESS [1.866s]
[INFO] Phoenix Hadoop Compatibility ...................... SUCCESS [1.532s]
[INFO] Phoenix Hadoop2 Compatibility ..................... SUCCESS [2.921s]
[INFO] Phoenix Core ...................................... FAILURE [11:29.991s]
[INFO] Phoenix - Flume ................................... SKIPPED
[INFO] Phoenix - Pig ..................................... SKIPPED
[INFO] Phoenix Assembly .................................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 11:36.728s
[INFO] Finished at: Sat Sep 27 05:29:51 UTC 2014
[INFO] Final Memory: 45M/342M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.17:verify (ClientManagedTimeTests) on project phoenix-core: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Phoenix-4.0-hadoop2/ws/phoenix-core/target/failsafe-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :phoenix-core
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Sending artifact delta relative to Phoenix | 4.0 | Hadoop2 #172
Archived 701 artifacts
Archive block size is 32768
Received 4714 blocks and 273119930 bytes
Compression is 36.1%
Took 1 min 44 sec
Recording test results

Jenkins build is back to normal : Phoenix | 4.0 | Hadoop2 #175

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Phoenix-4.0-hadoop2/175/changes>


Build failed in Jenkins: Phoenix | 4.0 | Hadoop2 #174

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Phoenix-4.0-hadoop2/174/changes>

Changes:

[jtaylor] PHOENIX-1298 Queries on fixed width type columns that have an index declared on them don't use that index
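
Note: PHOENIX-1298 describes queries that filter on a fixed-width column (for example CHAR or INTEGER) failing to pick an index declared on that column. The sketch below is a hypothetical way to reproduce the scenario and inspect the chosen plan; the table, index, and column names and the JDBC URL are assumptions, not taken from the patch.

    // Hypothetical illustration of the scenario described by PHOENIX-1298:
    // an index declared on a fixed-width column (CHAR here) and a query whose
    // EXPLAIN plan can be inspected to see whether the index is chosen.
    // Table, index, and column names plus the JDBC URL are assumptions.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class FixedWidthIndexSketch {
        public static void main(String[] args) throws Exception {
            try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost");
                 Statement stmt = conn.createStatement()) {
                stmt.execute("CREATE TABLE IF NOT EXISTS T (K VARCHAR PRIMARY KEY, V CHAR(10))");
                stmt.execute("CREATE INDEX IF NOT EXISTS T_V_IDX ON T(V)");
                // Inspect the plan; with the fix, a filter on the fixed-width
                // indexed column should be able to use T_V_IDX.
                ResultSet rs = stmt.executeQuery("EXPLAIN SELECT K FROM T WHERE V = 'abc'");
                while (rs.next()) {
                    System.out.println(rs.getString(1));
                }
            }
        }
    }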

------------------------------------------
[...truncated 1479 lines...]
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.postScannerOpen(RegionCoprocessorHost.java:1845)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.scan(HRegionServer.java:3092)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:29497)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2027)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:114)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:94)
	at java.lang.Thread.run(Thread.java:724)
Caused by: org.apache.phoenix.memory.InsufficientMemoryException: Requested memory of 5760 bytes could not be allocated from remaining memory of 35568 bytes from global pool of 40000 bytes after waiting for 0ms.
	at org.apache.phoenix.memory.GlobalMemoryManager.allocateBytes(GlobalMemoryManager.java:81)
	at org.apache.phoenix.memory.GlobalMemoryManager.allocate(GlobalMemoryManager.java:100)
	at org.apache.phoenix.memory.GlobalMemoryManager.allocate(GlobalMemoryManager.java:106)
	at org.apache.phoenix.cache.aggcache.SpillableGroupByCache.<init>(SpillableGroupByCache.java:150)
	at org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver$GroupByCacheFactory.newCache(GroupedAggregateRegionObserver.java:365)
	at org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver.scanUnordered(GroupedAggregateRegionObserver.java:400)
	at org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver.doPostScannerOpen(GroupedAggregateRegionObserver.java:161)
	at org.apache.phoenix.coprocessor.BaseScannerRegionObserver.postScannerOpen(BaseScannerRegionObserver.java:140)
	... 8 more

	at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:262)
	at java.util.concurrent.FutureTask.get(FutureTask.java:119)
	at org.apache.phoenix.iterate.ParallelIterators.getIterators(ParallelIterators.java:281)
	at org.apache.phoenix.iterate.MergeSortResultIterator.getIterators(MergeSortResultIterator.java:48)
	at org.apache.phoenix.iterate.MergeSortResultIterator.minIterator(MergeSortResultIterator.java:63)
	at org.apache.phoenix.iterate.MergeSortResultIterator.next(MergeSortResultIterator.java:90)
	at org.apache.phoenix.iterate.GroupedAggregatingResultIterator.next(GroupedAggregatingResultIterator.java:68)
	at org.apache.phoenix.iterate.OrderedResultIterator.getResultIterator(OrderedResultIterator.java:215)
	at org.apache.phoenix.iterate.OrderedResultIterator.next(OrderedResultIterator.java:169)
	at org.apache.phoenix.iterate.OrderedAggregatingResultIterator.next(OrderedAggregatingResultIterator.java:50)
	at org.apache.phoenix.iterate.DelegateResultIterator.next(DelegateResultIterator.java:40)
	at org.apache.phoenix.jdbc.PhoenixResultSet.next(PhoenixResultSet.java:732)
	at org.apache.phoenix.end2end.HashJoinIT.testSubJoin(HashJoinIT.java:3296)
Caused by: org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: Join.OrderTable,,1411806340009.4733f67f56ac3c97ce39633f1eb8057e.: Requested memory of 5760 bytes could not be allocated from remaining memory of 35568 bytes from global pool of 40000 bytes after waiting for 0ms.
	at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:77)
	at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:45)
	at org.apache.phoenix.coprocessor.BaseScannerRegionObserver.postScannerOpen(BaseScannerRegionObserver.java:158)
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.postScannerOpen(RegionCoprocessorHost.java:1845)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.scan(HRegionServer.java:3092)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:29497)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2027)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:114)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:94)
	at java.lang.Thread.run(Thread.java:724)
Caused by: org.apache.phoenix.memory.InsufficientMemoryException: Requested memory of 5760 bytes could not be allocated from remaining memory of 35568 bytes from global pool of 40000 bytes after waiting for 0ms.
	at org.apache.phoenix.memory.GlobalMemoryManager.allocateBytes(GlobalMemoryManager.java:81)
	at org.apache.phoenix.memory.GlobalMemoryManager.allocate(GlobalMemoryManager.java:100)
	at org.apache.phoenix.memory.GlobalMemoryManager.allocate(GlobalMemoryManager.java:106)
	at org.apache.phoenix.cache.aggcache.SpillableGroupByCache.<init>(SpillableGroupByCache.java:150)
	at org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver$GroupByCacheFactory.newCache(GroupedAggregateRegionObserver.java:365)
	at org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver.scanUnordered(GroupedAggregateRegionObserver.java:400)
	at org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver.doPostScannerOpen(GroupedAggregateRegionObserver.java:161)
	at org.apache.phoenix.coprocessor.BaseScannerRegionObserver.postScannerOpen(BaseScannerRegionObserver.java:140)
	... 8 more

	at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:101)
	at org.apache.phoenix.iterate.TableResultIterator.<init>(TableResultIterator.java:57)
	at org.apache.phoenix.iterate.ParallelIterators$3.call(ParallelIterators.java:363)
	at org.apache.phoenix.iterate.ParallelIterators$3.call(ParallelIterators.java:358)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:724)
Caused by: org.apache.hadoop.hbase.DoNotRetryIOException: org.apache.hadoop.hbase.DoNotRetryIOException: Join.OrderTable,,1411806340009.4733f67f56ac3c97ce39633f1eb8057e.: Requested memory of 5760 bytes could not be allocated from remaining memory of 35568 bytes from global pool of 40000 bytes after waiting for 0ms.
	at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:77)
	at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:45)
	at org.apache.phoenix.coprocessor.BaseScannerRegionObserver.postScannerOpen(BaseScannerRegionObserver.java:158)
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.postScannerOpen(RegionCoprocessorHost.java:1845)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.scan(HRegionServer.java:3092)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:29497)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2027)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:114)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:94)
	at java.lang.Thread.run(Thread.java:724)
Caused by: org.apache.phoenix.memory.InsufficientMemoryException: Requested memory of 5760 bytes could not be allocated from remaining memory of 35568 bytes from global pool of 40000 bytes after waiting for 0ms.
	at org.apache.phoenix.memory.GlobalMemoryManager.allocateBytes(GlobalMemoryManager.java:81)
	at org.apache.phoenix.memory.GlobalMemoryManager.allocate(GlobalMemoryManager.java:100)
	at org.apache.phoenix.memory.GlobalMemoryManager.allocate(GlobalMemoryManager.java:106)
	at org.apache.phoenix.cache.aggcache.SpillableGroupByCache.<init>(SpillableGroupByCache.java:150)
	at org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver$GroupByCacheFactory.newCache(GroupedAggregateRegionObserver.java:365)
	at org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver.scanUnordered(GroupedAggregateRegionObserver.java:400)
	at org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver.doPostScannerOpen(GroupedAggregateRegionObserver.java:161)
	at org.apache.phoenix.coprocessor.BaseScannerRegionObserver.postScannerOpen(BaseScannerRegionObserver.java:140)
	... 8 more

	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
	at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
	at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:285)
	at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:316)
	at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:164)
	at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:59)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:114)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:90)
	at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:282)
	at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:187)
	at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:182)
	at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:109)
	at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:738)
	at org.apache.phoenix.iterate.TableResultIterator.<init>(TableResultIterator.java:54)
	at org.apache.phoenix.iterate.ParallelIterators$3.call(ParallelIterators.java:363)
	at org.apache.phoenix.iterate.ParallelIterators$3.call(ParallelIterators.java:358)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:724)
Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: org.apache.hadoop.hbase.DoNotRetryIOException: Join.OrderTable,,1411806340009.4733f67f56ac3c97ce39633f1eb8057e.: Requested memory of 5760 bytes could not be allocated from remaining memory of 35568 bytes from global pool of 40000 bytes after waiting for 0ms.
	at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:77)
	at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:45)
	at org.apache.phoenix.coprocessor.BaseScannerRegionObserver.postScannerOpen(BaseScannerRegionObserver.java:158)
	at org.apache.hadoop.hbase.regionserver.RegionCoprocessorHost.postScannerOpen(RegionCoprocessorHost.java:1845)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.scan(HRegionServer.java:3092)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:29497)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2027)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:98)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:114)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:94)
	at java.lang.Thread.run(Thread.java:724)
Caused by: org.apache.phoenix.memory.InsufficientMemoryException: Requested memory of 5760 bytes could not be allocated from remaining memory of 35568 bytes from global pool of 40000 bytes after waiting for 0ms.
	at org.apache.phoenix.memory.GlobalMemoryManager.allocateBytes(GlobalMemoryManager.java:81)
	at org.apache.phoenix.memory.GlobalMemoryManager.allocate(GlobalMemoryManager.java:100)
	at org.apache.phoenix.memory.GlobalMemoryManager.allocate(GlobalMemoryManager.java:106)
	at org.apache.phoenix.cache.aggcache.SpillableGroupByCache.<init>(SpillableGroupByCache.java:150)
	at org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver$GroupByCacheFactory.newCache(GroupedAggregateRegionObserver.java:365)
	at org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver.scanUnordered(GroupedAggregateRegionObserver.java:400)
	at org.apache.phoenix.coprocessor.GroupedAggregateRegionObserver.doPostScannerOpen(GroupedAggregateRegionObserver.java:161)
	at org.apache.phoenix.coprocessor.BaseScannerRegionObserver.postScannerOpen(BaseScannerRegionObserver.java:140)
	... 8 more

	at org.apache.hadoop.hbase.ipc.RpcClient.call(RpcClient.java:1452)
	at org.apache.hadoop.hbase.ipc.RpcClient.callBlockingMethod(RpcClient.java:1656)
	at org.apache.hadoop.hbase.ipc.RpcClient$BlockingRpcChannelImplementation.callBlockingMethod(RpcClient.java:1714)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:29900)
	at org.apache.hadoop.hbase.client.ScannerCallable.openScanner(ScannerCallable.java:308)
	at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:164)
	at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:59)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:114)
	at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:90)
	at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:282)
	at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:187)
	at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:182)
	at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:109)
	at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:738)
	at org.apache.phoenix.iterate.TableResultIterator.<init>(TableResultIterator.java:54)
	at org.apache.phoenix.iterate.ParallelIterators$3.call(ParallelIterators.java:363)
	at org.apache.phoenix.iterate.ParallelIterators$3.call(ParallelIterators.java:358)
	at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
	at java.util.concurrent.FutureTask.run(FutureTask.java:166)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:724)


Results :

Tests in error: 
  LocalIndexIT.testLocalIndexScanJoinColumnsFromDataTable:439 ? PhoenixIO org.ap...
  HashJoinIT.testJoinWithSubqueryAndAggregation:3503 ? PhoenixIO org.apache.phoe...
  HashJoinIT.testLeftJoinWithAggregation:1990 ? PhoenixIO org.apache.phoenix.exc...
  HashJoinIT.testUpsertWithJoin:2953 ? PhoenixIO org.apache.phoenix.exception.Ph...
  HashJoinIT.testRightJoinWithAggregation:2052 ? PhoenixIO org.apache.phoenix.ex...
  HashJoinIT.testSubJoin:3296 ? PhoenixIO org.apache.phoenix.exception.PhoenixIO...

Tests run: 497, Failures: 0, Errors: 6, Skipped: 0
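
Note on the errors above: the stack traces all bottom out in InsufficientMemoryException raised by GlobalMemoryManager when the spillable group-by cache could not allocate 5760 bytes from the 40000-byte global pool configured for the test cluster, which the "Increasing memory a little bit" change in build #173 appears to target. A minimal, hedged sketch of raising that pool for a client connection follows; the property key phoenix.query.maxGlobalMemorySize is an assumption inferred from the stack traces and should be checked against QueryServices for the Phoenix version in use.

    // Hedged sketch: raising the Phoenix global memory pool for a test connection.
    // The property key below (phoenix.query.maxGlobalMemorySize) is an assumption
    // based on the GlobalMemoryManager stack traces above; verify it against
    // QueryServices in the Phoenix version in use before relying on it.
    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.util.Properties;

    public class MemoryPoolSketch {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            // The failing runs used a 40000-byte pool; a larger value gives the
            // spillable group-by cache room to allocate its initial chunk.
            props.setProperty("phoenix.query.maxGlobalMemorySize", "100000");
            try (Connection conn = DriverManager.getConnection("jdbc:phoenix:localhost", props)) {
                System.out.println("Connected with enlarged global memory pool");
            }
        }
    }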

[INFO] 
[INFO] --- maven-failsafe-plugin:2.17:integration-test (NeedTheirOwnClusterTests) @ phoenix-core ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/Phoenix-4.0-hadoop2/ws/phoenix-core/target/failsafe-reports>
[INFO] parallel='none', perCoreThreadCount=true, threadCount=0, useUnlimitedThreads=false, threadCountSuites=0, threadCountClasses=0, threadCountMethods=0, parallelOptimized=true

-------------------------------------------------------
 T E S T S
-------------------------------------------------------

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.phoenix.hbase.index.covered.example.FailWithoutRetriesIT
Running org.apache.phoenix.hbase.index.covered.example.EndToEndCoveredIndexingIT
Running org.apache.phoenix.hbase.index.covered.EndToEndCoveredColumnsIndexBuilderIT
Running org.apache.phoenix.hbase.index.covered.example.EndtoEndIndexingWithCompressionIT
Running org.apache.phoenix.hbase.index.balancer.IndexLoadBalancerIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.465 sec - in org.apache.phoenix.hbase.index.covered.example.FailWithoutRetriesIT
Running org.apache.phoenix.hbase.index.FailForUnsupportedHBaseVersionsIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.015 sec - in org.apache.phoenix.hbase.index.covered.EndToEndCoveredColumnsIndexBuilderIT
Running org.apache.phoenix.end2end.index.MutableIndexFailureIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.699 sec - in org.apache.phoenix.hbase.index.FailForUnsupportedHBaseVersionsIT
Running org.apache.phoenix.end2end.ContextClassloaderIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.618 sec - in org.apache.phoenix.end2end.ContextClassloaderIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.594 sec - in org.apache.phoenix.hbase.index.covered.example.EndtoEndIndexingWithCompressionIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 48.414 sec - in org.apache.phoenix.hbase.index.covered.example.EndToEndCoveredIndexingIT
Running org.apache.phoenix.mapreduce.CsvBulkLoadToolIT
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 110.974 sec - in org.apache.phoenix.hbase.index.balancer.IndexLoadBalancerIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 156.707 sec - in org.apache.phoenix.end2end.index.MutableIndexFailureIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 175.052 sec - in org.apache.phoenix.mapreduce.CsvBulkLoadToolIT

Results :

Tests run: 46, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-failsafe-plugin:2.17:verify (ClientManagedTimeTests) @ phoenix-core ---
[INFO] Failsafe report directory: <https://builds.apache.org/job/Phoenix-4.0-hadoop2/ws/phoenix-core/target/failsafe-reports>
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Phoenix .................................... SUCCESS [2.272s]
[INFO] Phoenix Hadoop Compatibility ...................... SUCCESS [1.956s]
[INFO] Phoenix Hadoop2 Compatibility ..................... SUCCESS [3.847s]
[INFO] Phoenix Core ...................................... FAILURE [13:26.588s]
[INFO] Phoenix - Flume ................................... SKIPPED
[INFO] Phoenix - Pig ..................................... SKIPPED
[INFO] Phoenix Assembly .................................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:35.152s
[INFO] Finished at: Sat Sep 27 08:31:41 UTC 2014
[INFO] Final Memory: 45M/276M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.17:verify (ClientManagedTimeTests) on project phoenix-core: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Phoenix-4.0-hadoop2/ws/phoenix-core/target/failsafe-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :phoenix-core
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Sending artifact delta relative to Phoenix | 4.0 | Hadoop2 #172
Archived 711 artifacts
Archive block size is 32768
Received 5142 blocks and 274339868 bytes
Compression is 38.0%
Took 1 min 50 sec
Recording test results
Updating PHOENIX-1298