Posted to commits@phoenix.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2016/11/04 22:33:19 UTC

Build failed in Jenkins: Phoenix-encode-columns #19

See <https://builds.apache.org/job/Phoenix-encode-columns/19/>

------------------------------------------
Started by an SCM change
[EnvInject] - Loading node environment variables.
Building remotely on ubuntu-1 (Ubuntu ubuntu1 yahoo-not-h2 ubuntu docker) in workspace <https://builds.apache.org/job/Phoenix-encode-columns/ws/>
Cloning the remote Git repository
Cloning repository https://git-wip-us.apache.org/repos/asf/phoenix.git
 > git init <https://builds.apache.org/job/Phoenix-encode-columns/ws/> # timeout=10
Fetching upstream changes from https://git-wip-us.apache.org/repos/asf/phoenix.git
 > git --version # timeout=10
 > git -c core.askpass=true fetch --tags --progress https://git-wip-us.apache.org/repos/asf/phoenix.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://git-wip-us.apache.org/repos/asf/phoenix.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://git-wip-us.apache.org/repos/asf/phoenix.git # timeout=10
Fetching upstream changes from https://git-wip-us.apache.org/repos/asf/phoenix.git
 > git -c core.askpass=true fetch --tags --progress https://git-wip-us.apache.org/repos/asf/phoenix.git +refs/heads/*:refs/remotes/origin/*
 > git rev-parse origin/encodecolumns2^{commit} # timeout=10
Checking out Revision ede568e9c4e4d35e7f4afe19637c8dd7cf5af23c (origin/encodecolumns2)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f ede568e9c4e4d35e7f4afe19637c8dd7cf5af23c
 > git rev-list 8c31c93ab4808473e91880747ce2295f296006f3 # timeout=10
First time build. Skipping changelog.
No emails were triggered.
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
MAVEN_OPTS=-Xmx3G

[EnvInject] - Variables injected successfully.
[Phoenix-encode-columns] $ /bin/bash -xe /tmp/hudson3999974552990880096.sh
+ echo 'DELETING ~/.m2/repository/org/apache/htrace. See https://issues.apache.org/jira/browse/PHOENIX-1802'
DELETING ~/.m2/repository/org/apache/htrace. See https://issues.apache.org/jira/browse/PHOENIX-1802
+ echo 'CURRENT CONTENT:'
CURRENT CONTENT:
+ ls /home/jenkins/.m2/repository/org/apache/htrace
htrace
htrace-core
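The job script above works around PHOENIX-1802 by deleting the cached htrace artifacts from the local Maven repository before the build, so Maven re-resolves them cleanly. A minimal stand-alone sketch of that step (the repository path is the Jenkins default and is assumed, not taken from the job configuration):

```shell
# Sketch of the PHOENIX-1802 workaround performed by the job script:
# wipe the cached org/apache/htrace artifacts so Maven re-downloads them.
clear_htrace_cache() {
  local repo="${1:-$HOME/.m2/repository}"       # assumed default local repo
  local htrace_dir="$repo/org/apache/htrace"
  echo "DELETING $htrace_dir. See https://issues.apache.org/jira/browse/PHOENIX-1802"
  ls "$htrace_dir" 2>/dev/null || true          # show current content, as the job does
  rm -rf "$htrace_dir"
}

clear_htrace_cache    # defaults to ~/.m2/repository
```

This mirrors the `echo`/`ls` pattern visible in the log; the actual job runs the equivalent commands inline from `/tmp/hudson*.sh`.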
[Phoenix-encode-columns] $ /home/jenkins/tools/maven/latest3/bin/mvn -U clean install -Dcheckstyle.skip=true
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO] 
[INFO] Apache Phoenix
[INFO] Phoenix Core
[INFO] Phoenix - Flume
[INFO] Phoenix - Pig
[INFO] Phoenix Query Server Client
[INFO] Phoenix Query Server
[INFO] Phoenix - Pherf
[INFO] Phoenix - Spark
[INFO] Phoenix - Hive
[INFO] Phoenix Client
[INFO] Phoenix Server
[INFO] Phoenix Assembly
[INFO] Phoenix - Tracing Web Application
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Phoenix 4.9.0-HBase-0.98
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ phoenix ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.13:check (validate) @ phoenix ---
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ phoenix ---
[INFO] 
[INFO] --- maven-source-plugin:2.2.1:jar-no-fork (attach-sources) @ phoenix ---
[INFO] 
[INFO] --- maven-jar-plugin:2.4:test-jar (default) @ phoenix ---
[WARNING] JAR will be empty - no content was marked for inclusion!
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-encode-columns/19/artifact/target/phoenix-4.9.0-HBase-0.98-tests.jar>
[INFO] 
[INFO] --- maven-site-plugin:3.2:attach-descriptor (attach-descriptor) @ phoenix ---
[INFO] 
[INFO] --- maven-install-plugin:2.5.2:install (default-install) @ phoenix ---
[INFO] Installing <https://builds.apache.org/job/Phoenix-encode-columns/ws/pom.xml> to /home/jenkins/.m2/repository/org/apache/phoenix/phoenix/4.9.0-HBase-0.98/phoenix-4.9.0-HBase-0.98.pom
[INFO] Installing <https://builds.apache.org/job/Phoenix-encode-columns/19/artifact/target/phoenix-4.9.0-HBase-0.98-tests.jar> to /home/jenkins/.m2/repository/org/apache/phoenix/phoenix/4.9.0-HBase-0.98/phoenix-4.9.0-HBase-0.98-tests.jar
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Phoenix Core 4.9.0-HBase-0.98
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ phoenix-core ---
[INFO] 
[INFO] --- maven-checkstyle-plugin:2.13:check (validate) @ phoenix-core ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-test-source (add-test-source) @ phoenix-core ---
[INFO] Test Source directory: <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/it/java> added.
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-test-resource (add-test-resource) @ phoenix-core ---
[INFO] 
[INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-source) @ phoenix-core ---
[INFO] Source directory: <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/target/generated-sources/antlr3> added.
[INFO] Source directory: <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/antlr3> added.
[INFO] 
[INFO] --- antlr3-maven-plugin:3.5.2:antlr (default) @ phoenix-core ---
[INFO] ANTLR: Processing source directory <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/antlr3>
ANTLR Parser Generator  Version 3.5.2
Output file <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/target/generated-sources/antlr3/org/apache/phoenix/parse/PhoenixSQLParser.java> does not exist: must build <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/antlr3/PhoenixSQL.g>
PhoenixSQL.g
[INFO] 
[INFO] --- maven-dependency-plugin:2.1:build-classpath (create-phoenix-generated-classpath) @ phoenix-core ---
[INFO] Wrote classpath file '<https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/target/cached_classpath.txt>'.
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ phoenix-core ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ phoenix-core ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource to META-INF/services
[INFO] Copying 3 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.0:compile (default-compile) @ phoenix-core ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 968 source files to <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/target/classes>
[INFO] -------------------------------------------------------------
[WARNING] COMPILATION WARNING : 
[INFO] -------------------------------------------------------------
[WARNING] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/execute/DescVarLengthFastByteComparisons.java>:[25,16] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/execute/DescVarLengthFastByteComparisons.java>:[116,26] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/execute/DescVarLengthFastByteComparisons.java>:[122,30] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/execute/DescVarLengthFastByteComparisons.java>:[126,39] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java>: Some input files use or override a deprecated API.
[WARNING] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java>: Recompile with -Xlint:deprecation for details.
[WARNING] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixStatement.java>: Some input files use unchecked or unsafe operations.
[WARNING] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixStatement.java>: Recompile with -Xlint:unchecked for details.
[INFO] 8 warnings 
[INFO] -------------------------------------------------------------
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR : 
[INFO] -------------------------------------------------------------
[ERROR] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/util/PhoenixRuntime.java>:[1151,29] cannot find symbol
  symbol:   method getColumn(java.lang.String)
  location: variable family of type org.apache.phoenix.schema.PColumnFamily
[ERROR] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/util/PhoenixRuntime.java>:[1153,28] cannot find symbol
  symbol:   method getColumn(java.lang.String)
  location: variable table of type org.apache.phoenix.schema.PTable
[INFO] 2 errors 
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Phoenix ..................................... SUCCESS [  2.632 s]
[INFO] Phoenix Core ....................................... FAILURE [ 32.312 s]
[INFO] Phoenix - Flume .................................... SKIPPED
[INFO] Phoenix - Pig ...................................... SKIPPED
[INFO] Phoenix Query Server Client ........................ SKIPPED
[INFO] Phoenix Query Server ............................... SKIPPED
[INFO] Phoenix - Pherf .................................... SKIPPED
[INFO] Phoenix - Spark .................................... SKIPPED
[INFO] Phoenix - Hive ..................................... SKIPPED
[INFO] Phoenix Client ..................................... SKIPPED
[INFO] Phoenix Server ..................................... SKIPPED
[INFO] Phoenix Assembly ................................... SKIPPED
[INFO] Phoenix - Tracing Web Application .................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 36.772 s
[INFO] Finished at: 2016-11-04T22:31:46+00:00
[INFO] Final Memory: 72M/1397M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.0:compile (default-compile) on project phoenix-core: Compilation failure: Compilation failure:
[ERROR] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/util/PhoenixRuntime.java>:[1151,29] cannot find symbol
[ERROR] symbol:   method getColumn(java.lang.String)
[ERROR] location: variable family of type org.apache.phoenix.schema.PColumnFamily
[ERROR] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/util/PhoenixRuntime.java>:[1153,28] cannot find symbol
[ERROR] symbol:   method getColumn(java.lang.String)
[ERROR] location: variable table of type org.apache.phoenix.schema.PTable
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :phoenix-core
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Compressed 124.83 KB of artifacts by 25.6% relative to #13
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?


Build failed in Jenkins: Phoenix-encode-columns #26

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Phoenix-encode-columns/26/>

------------------------------------------
[...truncated 391 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.037 sec - in org.apache.phoenix.expression.SignFunctionTest
Running org.apache.phoenix.expression.ArrayToStringFunctionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.359 sec - in org.apache.phoenix.expression.RegexpSplitFunctionTest
Running org.apache.phoenix.expression.RoundFloorCeilExpressionsTest
Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.063 sec - in org.apache.phoenix.expression.ArrayToStringFunctionTest
Running org.apache.phoenix.expression.RegexpSubstrFunctionTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.389 sec - in org.apache.phoenix.expression.NullValueTest
Running org.apache.phoenix.expression.OctetLengthFunctionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.039 sec - in org.apache.phoenix.expression.OctetLengthFunctionTest
Running org.apache.phoenix.expression.CoerceExpressionTest
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.015 sec - in org.apache.phoenix.expression.CoerceExpressionTest
Running org.apache.phoenix.expression.SortOrderExpressionTest
Tests run: 35, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.33 sec - in org.apache.phoenix.expression.ArrayConcatFunctionTest
Running org.apache.phoenix.expression.ArrayConstructorExpressionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.001 sec - in org.apache.phoenix.expression.ArrayConstructorExpressionTest
Running org.apache.phoenix.expression.PowerFunctionTest
Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.06 sec - in org.apache.phoenix.expression.SortOrderExpressionTest
Running org.apache.phoenix.expression.function.ExternalSqlTypeIdFunctionTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.006 sec - in org.apache.phoenix.expression.function.ExternalSqlTypeIdFunctionTest
Running org.apache.phoenix.expression.function.BuiltinFunctionConstructorTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.04 sec - in org.apache.phoenix.expression.PowerFunctionTest
Running org.apache.phoenix.expression.function.InstrFunctionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.013 sec - in org.apache.phoenix.expression.function.InstrFunctionTest
Running org.apache.phoenix.expression.ArrayFillFunctionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.462 sec - in org.apache.phoenix.expression.RegexpSubstrFunctionTest
Running org.apache.phoenix.expression.RegexpReplaceFunctionTest
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.029 sec - in org.apache.phoenix.expression.ArrayFillFunctionTest
Running org.apache.phoenix.expression.SqrtFunctionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.017 sec - in org.apache.phoenix.expression.RegexpReplaceFunctionTest
Running org.apache.phoenix.expression.CbrtFunctionTest
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.067 sec - in org.apache.phoenix.expression.function.BuiltinFunctionConstructorTest
Running org.apache.phoenix.expression.LnLogFunctionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.006 sec - in org.apache.phoenix.expression.CbrtFunctionTest
Running org.apache.phoenix.expression.ColumnExpressionTest
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 sec - in org.apache.phoenix.expression.ColumnExpressionTest
Running org.apache.phoenix.expression.AbsFunctionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.023 sec - in org.apache.phoenix.expression.SqrtFunctionTest
Running org.apache.phoenix.expression.StringToArrayFunctionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.034 sec - in org.apache.phoenix.expression.LnLogFunctionTest
Running org.apache.phoenix.query.OrderByTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.029 sec - in org.apache.phoenix.expression.AbsFunctionTest
Running org.apache.phoenix.query.ConnectionlessTest
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.031 sec - in org.apache.phoenix.expression.StringToArrayFunctionTest
Running org.apache.phoenix.query.KeyRangeIntersectTest
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.083 sec - in org.apache.phoenix.query.KeyRangeIntersectTest
Running org.apache.phoenix.query.KeyRangeUnionTest
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.046 sec - in org.apache.phoenix.query.KeyRangeUnionTest
Running org.apache.phoenix.query.HBaseFactoryProviderTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.019 sec - in org.apache.phoenix.query.HBaseFactoryProviderTest
Running org.apache.phoenix.query.ScannerLeaseRenewalTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.294 sec - in org.apache.phoenix.query.ConnectionlessTest
Running org.apache.phoenix.query.EncodedColumnQualifierCellsListTest
Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.021 sec - in org.apache.phoenix.query.EncodedColumnQualifierCellsListTest
Running org.apache.phoenix.query.ParallelIteratorsSplitTest
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.932 sec - in org.apache.phoenix.expression.RoundFloorCeilExpressionsTest
Running org.apache.phoenix.query.QueryPlanTest
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.607 sec - in org.apache.phoenix.query.OrderByTest
Running org.apache.phoenix.query.PhoenixStatsCacheRemovalListenerTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 sec - in org.apache.phoenix.query.PhoenixStatsCacheRemovalListenerTest
Running org.apache.phoenix.query.KeyRangeCoalesceTest
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.071 sec - in org.apache.phoenix.query.KeyRangeCoalesceTest
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.622 sec - in org.apache.phoenix.query.ParallelIteratorsSplitTest
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.593 sec - in org.apache.phoenix.query.QueryPlanTest
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.459 sec - in org.apache.phoenix.jdbc.SecureUserConnectionsTest
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.297 sec - in org.apache.phoenix.query.ScannerLeaseRenewalTest

Results :

Tests run: 1461, Failures: 0, Errors: 0, Skipped: 6

[INFO] 
[INFO] --- maven-source-plugin:2.2.1:jar-no-fork (attach-sources) @ phoenix-core ---
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-encode-columns/26/artifact/phoenix-core/target/phoenix-core-4.9.0-HBase-0.98-sources.jar>
[INFO] 
[INFO] --- maven-jar-plugin:2.4:test-jar (default) @ phoenix-core ---
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-encode-columns/26/artifact/phoenix-core/target/phoenix-core-4.9.0-HBase-0.98-tests.jar>
[INFO] 
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ phoenix-core ---
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-encode-columns/26/artifact/phoenix-core/target/phoenix-core-4.9.0-HBase-0.98.jar>
[INFO] 
[INFO] --- maven-site-plugin:3.2:attach-descriptor (attach-descriptor) @ phoenix-core ---
[INFO] 
[INFO] --- maven-assembly-plugin:2.5.2:single (core) @ phoenix-core ---
[INFO] Reading assembly descriptor: src/build/phoenix-core.xml
[WARNING] Artifact: org.apache.phoenix:phoenix-core:jar:4.9.0-HBase-0.98 references the same file as the assembly destination file. Moving it to a temporary location for inclusion.
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-encode-columns/26/artifact/phoenix-core/target/phoenix-core-4.9.0-HBase-0.98.jar>
[INFO] 
[INFO] --- maven-failsafe-plugin:2.19.1:integration-test (ParallelStatsEnabledTest) @ phoenix-core ---

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.phoenix.end2end.MultiCfQueryExecIT
Running org.apache.phoenix.end2end.ParallelIteratorsIT
Running org.apache.phoenix.coprocessor.StatisticsCollectionRunTrackerIT
Running org.apache.phoenix.end2end.SaltedViewIT
Running org.apache.phoenix.end2end.KeyOnlyIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.874 sec - in org.apache.phoenix.end2end.ParallelIteratorsIT
Running org.apache.phoenix.end2end.TenantSpecificTablesDDLIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.526 sec - in org.apache.phoenix.end2end.KeyOnlyIT
Running org.apache.phoenix.end2end.TenantSpecificTablesDMLIT
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.749 sec - in org.apache.phoenix.end2end.MultiCfQueryExecIT
Running org.apache.phoenix.end2end.TransactionalViewIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.436 sec - in org.apache.phoenix.coprocessor.StatisticsCollectionRunTrackerIT
Running org.apache.phoenix.end2end.ViewIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.792 sec - in org.apache.phoenix.end2end.TransactionalViewIT
Running org.apache.phoenix.end2end.index.ImmutableIndexWithStatsIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.264 sec - in org.apache.phoenix.end2end.index.ImmutableIndexWithStatsIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.886 sec - in org.apache.phoenix.end2end.SaltedViewIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 33.112 sec - in org.apache.phoenix.end2end.TenantSpecificTablesDMLIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 38.32 sec - in org.apache.phoenix.end2end.TenantSpecificTablesDDLIT
Tests run: 46, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 88.929 sec - in org.apache.phoenix.end2end.ViewIT

Results :

Tests run: 104, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-failsafe-plugin:2.19.1:integration-test (ParallelStatsDisabledTest) @ phoenix-core ---

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.phoenix.end2end.AlterTableIT
Running org.apache.phoenix.end2end.AlterMultiTenantTableWithViewsIT
Running org.apache.phoenix.end2end.AppendOnlySchemaIT
Running org.apache.phoenix.end2end.AbsFunctionEnd2EndIT
Running org.apache.phoenix.end2end.AlterTableWithViewsIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.151 sec - in org.apache.phoenix.end2end.AbsFunctionEnd2EndIT
Running org.apache.phoenix.end2end.ArithmeticQueryIT
Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.403 sec - in org.apache.phoenix.end2end.ArithmeticQueryIT
Running org.apache.phoenix.end2end.ArrayAppendFunctionIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.822 sec - in org.apache.phoenix.end2end.AppendOnlySchemaIT
Running org.apache.phoenix.end2end.ArrayConcatFunctionIT
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.839 sec - in org.apache.phoenix.end2end.ArrayConcatFunctionIT
Running org.apache.phoenix.end2end.ArrayFillFunctionIT
Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.991 sec - in org.apache.phoenix.end2end.ArrayAppendFunctionIT
Running org.apache.phoenix.end2end.ArrayPrependFunctionIT
Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.818 sec - in org.apache.phoenix.end2end.ArrayFillFunctionIT
Running org.apache.phoenix.end2end.ArrayToStringFunctionIT
Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.795 sec - in org.apache.phoenix.end2end.ArrayPrependFunctionIT
Running org.apache.phoenix.end2end.ArraysWithNullsIT
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 55.07 sec - in org.apache.phoenix.end2end.AlterMultiTenantTableWithViewsIT
Running org.apache.phoenix.end2end.AutoCommitIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.315 sec - in org.apache.phoenix.end2end.AutoCommitIT
Running org.apache.phoenix.end2end.AutoPartitionViewsIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.249 sec - in org.apache.phoenix.end2end.ArraysWithNullsIT
Running org.apache.phoenix.end2end.BinaryRowKeyIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.913 sec - in org.apache.phoenix.end2end.BinaryRowKeyIT
Running org.apache.phoenix.end2end.CSVCommonsLoaderIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.585 sec - in org.apache.phoenix.end2end.CSVCommonsLoaderIT
Running org.apache.phoenix.end2end.CbrtFunctionEnd2EndIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.412 sec - in org.apache.phoenix.end2end.CbrtFunctionEnd2EndIT
Running org.apache.phoenix.end2end.CoalesceFunctionIT
Tests run: 36, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.912 sec - in org.apache.phoenix.end2end.ArrayToStringFunctionIT
Running org.apache.phoenix.end2end.ConvertTimezoneFunctionIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.271 sec - in org.apache.phoenix.end2end.CoalesceFunctionIT
Running org.apache.phoenix.end2end.DateTimeIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.098 sec - in org.apache.phoenix.end2end.ConvertTimezoneFunctionIT
Running org.apache.phoenix.end2end.DecodeFunctionIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.486 sec - in org.apache.phoenix.end2end.DecodeFunctionIT
Running org.apache.phoenix.end2end.DefaultColumnValueIT
Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 79.479 sec - in org.apache.phoenix.end2end.AlterTableWithViewsIT
Running org.apache.phoenix.end2end.DeleteIT
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 31.023 sec - in org.apache.phoenix.end2end.AutoPartitionViewsIT
Running org.apache.phoenix.end2end.DisableLocalIndexIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.807 sec - in org.apache.phoenix.end2end.DisableLocalIndexIT
Running org.apache.phoenix.end2end.DistinctPrefixFilterIT
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 24.763 sec - in org.apache.phoenix.end2end.DefaultColumnValueIT
Running org.apache.phoenix.end2end.DynamicColumnIT
Tests run: 49, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 38.988 sec - in org.apache.phoenix.end2end.DateTimeIT
Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.33 sec - in org.apache.phoenix.end2end.DeleteIT
Running org.apache.phoenix.end2end.DynamicUpsertIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 27.661 sec - in org.apache.phoenix.end2end.DynamicColumnIT
Running org.apache.phoenix.end2end.EncodeFunctionIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.771 sec - in org.apache.phoenix.end2end.DynamicUpsertIT
Running org.apache.phoenix.end2end.EvaluationOfORIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.788 sec - in org.apache.phoenix.end2end.EvaluationOfORIT
Running org.apache.phoenix.end2end.ExecuteStatementsIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.405 sec - in org.apache.phoenix.end2end.EncodeFunctionIT
Running org.apache.phoenix.end2end.ExpFunctionEnd2EndIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.297 sec - in org.apache.phoenix.end2end.ExpFunctionEnd2EndIT
Running org.apache.phoenix.end2end.FirstValueFunctionIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.015 sec - in org.apache.phoenix.end2end.FirstValueFunctionIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.155 sec - in org.apache.phoenix.end2end.ExecuteStatementsIT
Running org.apache.phoenix.end2end.FlappingAlterTableIT
Running org.apache.phoenix.end2end.DynamicFamilyIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.998 sec - in org.apache.phoenix.end2end.DynamicFamilyIT
Running org.apache.phoenix.end2end.GetSetByteBitFunctionEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.521 sec - in org.apache.phoenix.end2end.GetSetByteBitFunctionEnd2EndIT
Running org.apache.phoenix.end2end.GroupByCaseIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 64.163 sec - in org.apache.phoenix.end2end.DistinctPrefixFilterIT
Running org.apache.phoenix.end2end.HashJoinLocalIndexIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.42 sec - in org.apache.phoenix.end2end.FlappingAlterTableIT
Running org.apache.phoenix.end2end.HashJoinMoreIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.82 sec - in org.apache.phoenix.end2end.GroupByCaseIT
Running org.apache.phoenix.end2end.InListIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.103 sec - in org.apache.phoenix.end2end.HashJoinLocalIndexIT
Running org.apache.phoenix.end2end.InstrFunctionIT
Running org.apache.phoenix.end2end.HashJoinIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.046 sec - in org.apache.phoenix.end2end.InstrFunctionIT
Running org.apache.phoenix.end2end.IsNullIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.509 sec - in org.apache.phoenix.end2end.HashJoinMoreIT
Running org.apache.phoenix.end2end.LastValueFunctionIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.499 sec - in org.apache.phoenix.end2end.IsNullIT
Running org.apache.phoenix.end2end.LikeExpressionIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.715 sec - in org.apache.phoenix.end2end.LastValueFunctionIT
Running org.apache.phoenix.end2end.LnLogFunctionEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.527 sec - in org.apache.phoenix.end2end.LnLogFunctionEnd2EndIT
Running org.apache.phoenix.end2end.MD5FunctionIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.593 sec - in org.apache.phoenix.end2end.LikeExpressionIT
Running org.apache.phoenix.end2end.MapReduceIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.189 sec - in org.apache.phoenix.end2end.MD5FunctionIT
Running org.apache.phoenix.end2end.MappingTableDataTypeIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.523 sec - in org.apache.phoenix.end2end.MapReduceIT
Running org.apache.phoenix.end2end.MinMaxAggregateFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.326 sec - in org.apache.phoenix.end2end.MinMaxAggregateFunctionIT
Running org.apache.phoenix.end2end.ModulusExpressionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.855 sec - in org.apache.phoenix.end2end.MappingTableDataTypeIT
Running org.apache.phoenix.end2end.NamespaceSchemaMappingIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.584 sec - in org.apache.phoenix.end2end.ModulusExpressionIT
Running org.apache.phoenix.end2end.NthValueFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.035 sec - in org.apache.phoenix.end2end.NamespaceSchemaMappingIT
Running org.apache.phoenix.end2end.OctetLengthFunctionEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.299 sec - in org.apache.phoenix.end2end.OctetLengthFunctionEnd2EndIT
Running org.apache.phoenix.end2end.OnDuplicateKeyIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.395 sec - in org.apache.phoenix.end2end.NthValueFunctionIT
Running org.apache.phoenix.end2end.OrderByIT
Tests run: 61, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 274.446 sec - in org.apache.phoenix.end2end.AlterTableIT
Running org.apache.phoenix.end2end.PercentileIT

Results :

Tests run: 584, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-failsafe-plugin:2.19.1:integration-test (ClientManagedTimeTests) @ phoenix-core ---

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.phoenix.end2end.ArrayIT
Running org.apache.phoenix.end2end.CaseStatementIT
Running org.apache.phoenix.end2end.AggregateQueryIT
Running org.apache.phoenix.end2end.ClientTimeArithmeticQueryIT
Running org.apache.phoenix.end2end.CastAndCoerceIT
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Recording test results

Build failed in Jenkins: Phoenix-encode-columns #25

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Phoenix-encode-columns/25/changes>

Changes:

[samarth] Ignore DefaultColumnValueIT#testDefaultImmutableRows till PHOENIX-3442

------------------------------------------
[...truncated 515 lines...]
Running org.apache.phoenix.end2end.index.ImmutableIndexWithStatsIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.745 sec - in org.apache.phoenix.end2end.index.ImmutableIndexWithStatsIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 31.47 sec - in org.apache.phoenix.end2end.TenantSpecificTablesDDLIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 34.678 sec - in org.apache.phoenix.end2end.TenantSpecificTablesDMLIT
Tests run: 46, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 84.558 sec - in org.apache.phoenix.end2end.ViewIT

Results :

Tests run: 104, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-failsafe-plugin:2.19.1:integration-test (ParallelStatsDisabledTest) @ phoenix-core ---

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.phoenix.end2end.AlterTableWithViewsIT
Running org.apache.phoenix.end2end.AlterMultiTenantTableWithViewsIT
Running org.apache.phoenix.end2end.AbsFunctionEnd2EndIT
Running org.apache.phoenix.end2end.AppendOnlySchemaIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.098 sec - in org.apache.phoenix.end2end.AbsFunctionEnd2EndIT
Running org.apache.phoenix.end2end.ArithmeticQueryIT
Running org.apache.phoenix.end2end.AlterTableIT
Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.179 sec - in org.apache.phoenix.end2end.ArithmeticQueryIT
Running org.apache.phoenix.end2end.ArrayAppendFunctionIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.87 sec - in org.apache.phoenix.end2end.AppendOnlySchemaIT
Running org.apache.phoenix.end2end.ArrayConcatFunctionIT
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.742 sec - in org.apache.phoenix.end2end.ArrayConcatFunctionIT
Running org.apache.phoenix.end2end.ArrayFillFunctionIT
Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.076 sec - in org.apache.phoenix.end2end.ArrayAppendFunctionIT
Running org.apache.phoenix.end2end.ArrayPrependFunctionIT
Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.918 sec - in org.apache.phoenix.end2end.ArrayFillFunctionIT
Running org.apache.phoenix.end2end.ArrayToStringFunctionIT
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 54.257 sec - in org.apache.phoenix.end2end.AlterMultiTenantTableWithViewsIT
Running org.apache.phoenix.end2end.ArraysWithNullsIT
Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.02 sec - in org.apache.phoenix.end2end.ArrayPrependFunctionIT
Running org.apache.phoenix.end2end.AutoCommitIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.305 sec - in org.apache.phoenix.end2end.AutoCommitIT
Running org.apache.phoenix.end2end.AutoPartitionViewsIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 10.075 sec - in org.apache.phoenix.end2end.ArraysWithNullsIT
Running org.apache.phoenix.end2end.BinaryRowKeyIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.735 sec - in org.apache.phoenix.end2end.BinaryRowKeyIT
Running org.apache.phoenix.end2end.CSVCommonsLoaderIT
Tests run: 36, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.819 sec - in org.apache.phoenix.end2end.ArrayToStringFunctionIT
Running org.apache.phoenix.end2end.CbrtFunctionEnd2EndIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.953 sec - in org.apache.phoenix.end2end.CbrtFunctionEnd2EndIT
Running org.apache.phoenix.end2end.CoalesceFunctionIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.133 sec - in org.apache.phoenix.end2end.CSVCommonsLoaderIT
Running org.apache.phoenix.end2end.ConvertTimezoneFunctionIT
Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 78.322 sec - in org.apache.phoenix.end2end.AlterTableWithViewsIT
Running org.apache.phoenix.end2end.DateTimeIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.626 sec - in org.apache.phoenix.end2end.ConvertTimezoneFunctionIT
Running org.apache.phoenix.end2end.DecodeFunctionIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.289 sec - in org.apache.phoenix.end2end.CoalesceFunctionIT
Running org.apache.phoenix.end2end.DefaultColumnValueIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.196 sec - in org.apache.phoenix.end2end.DecodeFunctionIT
Running org.apache.phoenix.end2end.DeleteIT
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.63 sec - in org.apache.phoenix.end2end.AutoPartitionViewsIT
Running org.apache.phoenix.end2end.DisableLocalIndexIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.742 sec - in org.apache.phoenix.end2end.DisableLocalIndexIT
Running org.apache.phoenix.end2end.DistinctPrefixFilterIT
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.625 sec - in org.apache.phoenix.end2end.DefaultColumnValueIT
Running org.apache.phoenix.end2end.DynamicColumnIT
Tests run: 49, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 41.128 sec - in org.apache.phoenix.end2end.DateTimeIT
Running org.apache.phoenix.end2end.DynamicFamilyIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.837 sec - in org.apache.phoenix.end2end.DynamicFamilyIT
Running org.apache.phoenix.end2end.DynamicUpsertIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.491 sec - in org.apache.phoenix.end2end.DynamicUpsertIT
Running org.apache.phoenix.end2end.EncodeFunctionIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.431 sec - in org.apache.phoenix.end2end.EncodeFunctionIT
Running org.apache.phoenix.end2end.EvaluationOfORIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.284 sec - in org.apache.phoenix.end2end.EvaluationOfORIT
Running org.apache.phoenix.end2end.ExecuteStatementsIT
Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 50.87 sec - in org.apache.phoenix.end2end.DeleteIT
Running org.apache.phoenix.end2end.ExpFunctionEnd2EndIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.036 sec - in org.apache.phoenix.end2end.ExecuteStatementsIT
Running org.apache.phoenix.end2end.FirstValueFunctionIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.98 sec - in org.apache.phoenix.end2end.ExpFunctionEnd2EndIT
Running org.apache.phoenix.end2end.FlappingAlterTableIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.336 sec - in org.apache.phoenix.end2end.FirstValueFunctionIT
Running org.apache.phoenix.end2end.GetSetByteBitFunctionEnd2EndIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.324 sec - in org.apache.phoenix.end2end.DynamicColumnIT
Running org.apache.phoenix.end2end.GroupByCaseIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.309 sec - in org.apache.phoenix.end2end.GetSetByteBitFunctionEnd2EndIT
Running org.apache.phoenix.end2end.HashJoinIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.999 sec - in org.apache.phoenix.end2end.FlappingAlterTableIT
Running org.apache.phoenix.end2end.HashJoinLocalIndexIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 64.235 sec - in org.apache.phoenix.end2end.DistinctPrefixFilterIT
Running org.apache.phoenix.end2end.HashJoinMoreIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.123 sec - in org.apache.phoenix.end2end.GroupByCaseIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.91 sec - in org.apache.phoenix.end2end.HashJoinLocalIndexIT
Running org.apache.phoenix.end2end.InListIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.597 sec - in org.apache.phoenix.end2end.HashJoinMoreIT
Running org.apache.phoenix.end2end.IsNullIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.2 sec - in org.apache.phoenix.end2end.IsNullIT
Running org.apache.phoenix.end2end.LastValueFunctionIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.579 sec - in org.apache.phoenix.end2end.LastValueFunctionIT
Running org.apache.phoenix.end2end.LikeExpressionIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.66 sec - in org.apache.phoenix.end2end.LikeExpressionIT
Running org.apache.phoenix.end2end.LnLogFunctionEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.783 sec - in org.apache.phoenix.end2end.LnLogFunctionEnd2EndIT
Running org.apache.phoenix.end2end.MD5FunctionIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.604 sec - in org.apache.phoenix.end2end.MD5FunctionIT
Running org.apache.phoenix.end2end.MapReduceIT
Running org.apache.phoenix.end2end.InstrFunctionIT
Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x000000079afbb000, 289054720, 0) failed; error='Cannot allocate memory' (errno=12)
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (malloc) failed to allocate 289054720 bytes for committing reserved memory.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/hs_err_pid9777.log>
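The two native-memory crashes in this log report the exact allocation size the forked JVM could not commit. A minimal sketch (plain Python; the byte counts are copied verbatim from the two `os::commit_memory` warnings in this log) converts them to MiB to gauge how much headroom the build agent was short:

```python
# Convert the failed native allocation sizes reported by the two
# crashed forked test JVMs in this log into MiB.
failed_allocations = [289054720, 340787200]  # bytes, from the hs_err warnings

for nbytes in failed_allocations:
    mib = nbytes / (1024 * 1024)
    print(f"{nbytes} bytes = {mib:.0f} MiB")
# 289054720 bytes = 276 MiB
# 340787200 bytes = 325 MiB
```

Roughly 300 MiB per failed commit, on an agent already running several forked failsafe JVMs in parallel, is consistent with the fork dying and the later `verify` goal reporting "a timeout or other error in the fork".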
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.007 sec - in org.apache.phoenix.end2end.InstrFunctionIT
Running org.apache.phoenix.end2end.MappingTableDataTypeIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.1 sec - in org.apache.phoenix.end2end.MappingTableDataTypeIT
Running org.apache.phoenix.end2end.MinMaxAggregateFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.335 sec - in org.apache.phoenix.end2end.MinMaxAggregateFunctionIT
Running org.apache.phoenix.end2end.ModulusExpressionIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.361 sec - in org.apache.phoenix.end2end.ModulusExpressionIT
Running org.apache.phoenix.end2end.NamespaceSchemaMappingIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.874 sec - in org.apache.phoenix.end2end.NamespaceSchemaMappingIT
Running org.apache.phoenix.end2end.NthValueFunctionIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.326 sec - in org.apache.phoenix.end2end.NthValueFunctionIT
Running org.apache.phoenix.end2end.OctetLengthFunctionEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.283 sec - in org.apache.phoenix.end2end.OctetLengthFunctionEnd2EndIT
Running org.apache.phoenix.end2end.OnDuplicateKeyIT

Results :

Tests run: 521, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-failsafe-plugin:2.19.1:integration-test (ClientManagedTimeTests) @ phoenix-core ---

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.phoenix.end2end.AggregateQueryIT
Running org.apache.phoenix.end2end.CaseStatementIT
Running org.apache.phoenix.end2end.ClientTimeArithmeticQueryIT
Running org.apache.phoenix.end2end.CastAndCoerceIT
Running org.apache.phoenix.end2end.ArrayIT

Results :

Tests run: 0, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-failsafe-plugin:2.19.1:integration-test (HBaseManagedTimeTests) @ phoenix-core ---

-------------------------------------------------------
 T E S T S
-------------------------------------------------------

Results :

Tests run: 0, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-failsafe-plugin:2.19.1:integration-test (NeedTheirOwnClusterTests) @ phoenix-core ---

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.hadoop.hbase.regionserver.wal.WALReplayWithIndexWritesAndCompressedWALIT
Running org.apache.phoenix.end2end.ConnectionUtilIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.123 sec - in org.apache.hadoop.hbase.regionserver.wal.WALReplayWithIndexWritesAndCompressedWALIT
Running org.apache.phoenix.end2end.CountDistinctCompressionIT
Running org.apache.phoenix.end2end.ContextClassloaderIT
Running org.apache.phoenix.end2end.CsvBulkLoadToolIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.243 sec - in org.apache.phoenix.end2end.ConnectionUtilIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.591 sec - in org.apache.phoenix.end2end.CountDistinctCompressionIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.808 sec - in org.apache.phoenix.end2end.ContextClassloaderIT
Running org.apache.phoenix.end2end.FlappingLocalIndexIT
Running org.apache.phoenix.end2end.IndexExtendedIT
Running org.apache.phoenix.end2end.QueryTimeoutIT
Running org.apache.phoenix.end2end.QueryWithLimitIT
Tests run: 10, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 55.715 sec - in org.apache.phoenix.end2end.CsvBulkLoadToolIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.708 sec - in org.apache.phoenix.end2end.QueryWithLimitIT
Running org.apache.phoenix.end2end.RenewLeaseIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.002 sec - in org.apache.phoenix.end2end.RenewLeaseIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.355 sec - in org.apache.phoenix.end2end.QueryTimeoutIT
Tests run: 40, Failures: 0, Errors: 0, Skipped: 16, Time elapsed: 44.874 sec - in org.apache.phoenix.end2end.IndexExtendedIT
Running org.apache.phoenix.end2end.SpillableGroupByIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 63.804 sec - in org.apache.phoenix.end2end.FlappingLocalIndexIT
Running org.apache.phoenix.end2end.StatsCollectorIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.077 sec - in org.apache.phoenix.end2end.SpillableGroupByIT
Running org.apache.phoenix.end2end.SysTableNamespaceMappedStatsCollectorIT
Running org.apache.phoenix.end2end.UserDefinedFunctionsIT
Running org.apache.phoenix.end2end.index.ImmutableIndexIT
Running org.apache.phoenix.end2end.index.LocalIndexIT
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.991 sec - in org.apache.phoenix.end2end.UserDefinedFunctionsIT
Running org.apache.phoenix.end2end.index.MutableIndexFailureIT
Java HotSpot(TM) 64-Bit Server VM warning: INFO: os::commit_memory(0x00000007e7600000, 340787200, 0) failed; error='Cannot allocate memory' (errno=12)
#
# There is insufficient memory for the Java Runtime Environment to continue.
# Native memory allocation (malloc) failed to allocate 340787200 bytes for committing reserved memory.
# An error report file with more information is saved as:
# <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/hs_err_pid15394.log>
Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 95.914 sec - in org.apache.phoenix.end2end.StatsCollectorIT
Tests run: 40, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 95.323 sec - in org.apache.phoenix.end2end.SysTableNamespaceMappedStatsCollectorIT
Running org.apache.phoenix.end2end.index.ReadOnlyIndexFailureIT
Running org.apache.phoenix.hbase.index.FailForUnsupportedHBaseVersionsIT
Running org.apache.phoenix.hbase.index.covered.EndToEndCoveredColumnsIndexBuilderIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.875 sec - in org.apache.phoenix.hbase.index.FailForUnsupportedHBaseVersionsIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.211 sec - in org.apache.phoenix.hbase.index.covered.EndToEndCoveredColumnsIndexBuilderIT
Running org.apache.phoenix.hbase.index.covered.example.FailWithoutRetriesIT

Results :

Tests run: 169, Failures: 0, Errors: 0, Skipped: 17
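Build #25 runs five separate failsafe executions (ParallelStatsEnabledTest, ParallelStatsDisabledTest, ClientManagedTimeTests, HBaseManagedTimeTests, NeedTheirOwnClusterTests), each printing its own "Results" summary. A small sketch totalling those summary lines (copied from this log; the regex is an assumption about the fixed surefire/failsafe summary format) gives the overall picture before the build aborted:

```python
import re

# Per-execution "Results" summary lines from build #25, as printed above.
summaries = [
    "Tests run: 104, Failures: 0, Errors: 0, Skipped: 0",
    "Tests run: 521, Failures: 0, Errors: 0, Skipped: 0",
    "Tests run: 0, Failures: 0, Errors: 0, Skipped: 0",
    "Tests run: 0, Failures: 0, Errors: 0, Skipped: 0",
    "Tests run: 169, Failures: 0, Errors: 0, Skipped: 17",
]

pattern = re.compile(
    r"Tests run: (\d+), Failures: (\d+), Errors: (\d+), Skipped: (\d+)")

totals = [0, 0, 0, 0]  # run, failures, errors, skipped
for line in summaries:
    for i, field in enumerate(pattern.match(line).groups()):
        totals[i] += int(field)

print("run={} failures={} errors={} skipped={}".format(*totals))
# run=794 failures=0 errors=0 skipped=17
```

No test failed in any execution that completed; the build was marked FAILURE solely because a fork died (the native-memory crash above) before the remaining suites could finish.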

[INFO] 
[INFO] --- maven-failsafe-plugin:2.19.1:verify (ParallelStatsEnabledTest) @ phoenix-core ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Phoenix ..................................... SUCCESS [  3.511 s]
[INFO] Phoenix Core ....................................... FAILURE [16:27 min]
[INFO] Phoenix - Flume .................................... SKIPPED
[INFO] Phoenix - Pig ...................................... SKIPPED
[INFO] Phoenix Query Server Client ........................ SKIPPED
[INFO] Phoenix Query Server ............................... SKIPPED
[INFO] Phoenix - Pherf .................................... SKIPPED
[INFO] Phoenix - Spark .................................... SKIPPED
[INFO] Phoenix - Hive ..................................... SKIPPED
[INFO] Phoenix Client ..................................... SKIPPED
[INFO] Phoenix Server ..................................... SKIPPED
[INFO] Phoenix Assembly ................................... SKIPPED
[INFO] Phoenix - Tracing Web Application .................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 16:33 min
[INFO] Finished at: 2016-11-22T03:41:06+00:00
[INFO] Final Memory: 79M/1287M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.19.1:verify (ParallelStatsEnabledTest) on project phoenix-core: There was a timeout or other error in the fork -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :phoenix-core
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Recording test results

Build failed in Jenkins: Phoenix-encode-columns #24

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Phoenix-encode-columns/24/>

------------------------------------------
[...truncated 381 lines...]
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.001 sec - in org.apache.phoenix.expression.DeterminismTest
Running org.apache.phoenix.expression.GetSetByteBitFunctionTest
Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.529 sec - in org.apache.phoenix.index.IndexMaintainerTest
Running org.apache.phoenix.expression.util.regex.PatternPerformanceTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.003 sec - in org.apache.phoenix.expression.util.regex.PatternPerformanceTest
Running org.apache.phoenix.expression.ArrayAppendFunctionTest
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.24 sec - in org.apache.phoenix.expression.GetSetByteBitFunctionTest
Running org.apache.phoenix.expression.RegexpSplitFunctionTest
Tests run: 37, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.552 sec - in org.apache.phoenix.expression.ArrayPrependFunctionTest
Running org.apache.phoenix.expression.ILikeExpressionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.267 sec - in org.apache.phoenix.expression.RegexpSplitFunctionTest
Running org.apache.phoenix.expression.SignFunctionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.014 sec - in org.apache.phoenix.expression.SignFunctionTest
Running org.apache.phoenix.expression.NullValueTest
Tests run: 35, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.767 sec - in org.apache.phoenix.expression.ArrayConcatFunctionTest
Running org.apache.phoenix.expression.ArrayToStringFunctionTest
Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.065 sec - in org.apache.phoenix.expression.ArrayToStringFunctionTest
Running org.apache.phoenix.expression.RoundFloorCeilExpressionsTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.34 sec - in org.apache.phoenix.expression.ILikeExpressionTest
Running org.apache.phoenix.expression.RegexpSubstrFunctionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.061 sec - in org.apache.phoenix.expression.RegexpSubstrFunctionTest
Running org.apache.phoenix.expression.OctetLengthFunctionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.048 sec - in org.apache.phoenix.expression.OctetLengthFunctionTest
Running org.apache.phoenix.expression.CoerceExpressionTest
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.004 sec - in org.apache.phoenix.expression.CoerceExpressionTest
Running org.apache.phoenix.expression.SortOrderExpressionTest
Tests run: 24, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.052 sec - in org.apache.phoenix.expression.SortOrderExpressionTest
Running org.apache.phoenix.expression.ArrayConstructorExpressionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 sec - in org.apache.phoenix.expression.ArrayConstructorExpressionTest
Running org.apache.phoenix.expression.PowerFunctionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.023 sec - in org.apache.phoenix.expression.PowerFunctionTest
Running org.apache.phoenix.expression.function.ExternalSqlTypeIdFunctionTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.007 sec - in org.apache.phoenix.expression.function.ExternalSqlTypeIdFunctionTest
Running org.apache.phoenix.expression.function.BuiltinFunctionConstructorTest
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.115 sec - in org.apache.phoenix.expression.function.BuiltinFunctionConstructorTest
Running org.apache.phoenix.expression.function.InstrFunctionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.01 sec - in org.apache.phoenix.expression.function.InstrFunctionTest
Running org.apache.phoenix.expression.ArrayFillFunctionTest
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.029 sec - in org.apache.phoenix.expression.ArrayFillFunctionTest
Running org.apache.phoenix.expression.RegexpReplaceFunctionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.014 sec - in org.apache.phoenix.expression.RegexpReplaceFunctionTest
Running org.apache.phoenix.expression.SqrtFunctionTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.7 sec - in org.apache.phoenix.expression.NullValueTest
Running org.apache.phoenix.expression.CbrtFunctionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.029 sec - in org.apache.phoenix.expression.SqrtFunctionTest
Running org.apache.phoenix.expression.LnLogFunctionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.02 sec - in org.apache.phoenix.expression.CbrtFunctionTest
Running org.apache.phoenix.expression.ColumnExpressionTest
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.002 sec - in org.apache.phoenix.expression.ColumnExpressionTest
Running org.apache.phoenix.expression.AbsFunctionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.031 sec - in org.apache.phoenix.expression.AbsFunctionTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.063 sec - in org.apache.phoenix.expression.LnLogFunctionTest
Running org.apache.phoenix.expression.StringToArrayFunctionTest
Running org.apache.phoenix.query.OrderByTest
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.023 sec - in org.apache.phoenix.expression.StringToArrayFunctionTest
Running org.apache.phoenix.query.ConnectionlessTest
Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.362 sec - in org.apache.phoenix.expression.ArrayAppendFunctionTest
Running org.apache.phoenix.query.KeyRangeIntersectTest
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.077 sec - in org.apache.phoenix.query.KeyRangeIntersectTest
Running org.apache.phoenix.query.KeyRangeUnionTest
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.061 sec - in org.apache.phoenix.query.KeyRangeUnionTest
Running org.apache.phoenix.query.HBaseFactoryProviderTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.008 sec - in org.apache.phoenix.query.HBaseFactoryProviderTest
Running org.apache.phoenix.query.ScannerLeaseRenewalTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.354 sec - in org.apache.phoenix.query.ConnectionlessTest
Running org.apache.phoenix.query.EncodedColumnQualifierCellsListTest
Tests run: 22, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.036 sec - in org.apache.phoenix.query.EncodedColumnQualifierCellsListTest
Running org.apache.phoenix.query.ParallelIteratorsSplitTest
Tests run: 23, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.064 sec - in org.apache.phoenix.expression.RoundFloorCeilExpressionsTest
Running org.apache.phoenix.query.QueryPlanTest
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 15.043 sec - in org.apache.phoenix.jdbc.SecureUserConnectionsTest
Running org.apache.phoenix.query.PhoenixStatsCacheRemovalListenerTest
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.591 sec - in org.apache.phoenix.query.OrderByTest
Running org.apache.phoenix.query.KeyRangeCoalesceTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.009 sec - in org.apache.phoenix.query.PhoenixStatsCacheRemovalListenerTest
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.131 sec - in org.apache.phoenix.query.KeyRangeCoalesceTest
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.438 sec - in org.apache.phoenix.query.ParallelIteratorsSplitTest
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.467 sec - in org.apache.phoenix.query.QueryPlanTest
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.962 sec - in org.apache.phoenix.query.ScannerLeaseRenewalTest

Results :

Tests run: 1461, Failures: 0, Errors: 0, Skipped: 6

[INFO] 
[INFO] --- maven-source-plugin:2.2.1:jar-no-fork (attach-sources) @ phoenix-core ---
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-encode-columns/24/artifact/phoenix-core/target/phoenix-core-4.9.0-HBase-0.98-sources.jar>
[INFO] 
[INFO] --- maven-jar-plugin:2.4:test-jar (default) @ phoenix-core ---
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-encode-columns/24/artifact/phoenix-core/target/phoenix-core-4.9.0-HBase-0.98-tests.jar>
[INFO] 
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ phoenix-core ---
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-encode-columns/24/artifact/phoenix-core/target/phoenix-core-4.9.0-HBase-0.98.jar>
[INFO] 
[INFO] --- maven-site-plugin:3.2:attach-descriptor (attach-descriptor) @ phoenix-core ---
[INFO] 
[INFO] --- maven-assembly-plugin:2.5.2:single (core) @ phoenix-core ---
[INFO] Reading assembly descriptor: src/build/phoenix-core.xml
[WARNING] Artifact: org.apache.phoenix:phoenix-core:jar:4.9.0-HBase-0.98 references the same file as the assembly destination file. Moving it to a temporary location for inclusion.
[INFO] Building jar: <https://builds.apache.org/job/Phoenix-encode-columns/24/artifact/phoenix-core/target/phoenix-core-4.9.0-HBase-0.98.jar>
[INFO] 
[INFO] --- maven-failsafe-plugin:2.19.1:integration-test (ParallelStatsEnabledTest) @ phoenix-core ---

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.phoenix.coprocessor.StatisticsCollectionRunTrackerIT
Running org.apache.phoenix.end2end.KeyOnlyIT
Running org.apache.phoenix.end2end.ParallelIteratorsIT
Running org.apache.phoenix.end2end.MultiCfQueryExecIT
Running org.apache.phoenix.end2end.SaltedViewIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.137 sec - in org.apache.phoenix.end2end.ParallelIteratorsIT
Running org.apache.phoenix.end2end.TenantSpecificTablesDDLIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.589 sec - in org.apache.phoenix.end2end.KeyOnlyIT
Running org.apache.phoenix.end2end.TenantSpecificTablesDMLIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.641 sec - in org.apache.phoenix.coprocessor.StatisticsCollectionRunTrackerIT
Running org.apache.phoenix.end2end.TransactionalViewIT
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 14.58 sec - in org.apache.phoenix.end2end.MultiCfQueryExecIT
Running org.apache.phoenix.end2end.ViewIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.95 sec - in org.apache.phoenix.end2end.TransactionalViewIT
Running org.apache.phoenix.end2end.index.ImmutableIndexWithStatsIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.326 sec - in org.apache.phoenix.end2end.index.ImmutableIndexWithStatsIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 21.887 sec - in org.apache.phoenix.end2end.SaltedViewIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 31.957 sec - in org.apache.phoenix.end2end.TenantSpecificTablesDMLIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 37.651 sec - in org.apache.phoenix.end2end.TenantSpecificTablesDDLIT
Tests run: 46, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 81.797 sec - in org.apache.phoenix.end2end.ViewIT

Results :

Tests run: 104, Failures: 0, Errors: 0, Skipped: 0

[INFO] 
[INFO] --- maven-failsafe-plugin:2.19.1:integration-test (ParallelStatsDisabledTest) @ phoenix-core ---

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.phoenix.end2end.AlterTableIT
Running org.apache.phoenix.end2end.AlterMultiTenantTableWithViewsIT
Running org.apache.phoenix.end2end.AppendOnlySchemaIT
Running org.apache.phoenix.end2end.AlterTableWithViewsIT
Running org.apache.phoenix.end2end.AbsFunctionEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.583 sec - in org.apache.phoenix.end2end.AbsFunctionEnd2EndIT
Running org.apache.phoenix.end2end.ArithmeticQueryIT
Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.024 sec - in org.apache.phoenix.end2end.ArithmeticQueryIT
Running org.apache.phoenix.end2end.ArrayAppendFunctionIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.31 sec - in org.apache.phoenix.end2end.AppendOnlySchemaIT
Running org.apache.phoenix.end2end.ArrayConcatFunctionIT
Tests run: 30, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.978 sec - in org.apache.phoenix.end2end.ArrayConcatFunctionIT
Running org.apache.phoenix.end2end.ArrayFillFunctionIT
Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 19.851 sec - in org.apache.phoenix.end2end.ArrayAppendFunctionIT
Running org.apache.phoenix.end2end.ArrayPrependFunctionIT
Tests run: 26, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.096 sec - in org.apache.phoenix.end2end.ArrayFillFunctionIT
Running org.apache.phoenix.end2end.ArrayToStringFunctionIT
Tests run: 34, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.914 sec - in org.apache.phoenix.end2end.ArrayPrependFunctionIT
Running org.apache.phoenix.end2end.ArraysWithNullsIT
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 54.592 sec - in org.apache.phoenix.end2end.AlterMultiTenantTableWithViewsIT
Running org.apache.phoenix.end2end.AutoCommitIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.823 sec - in org.apache.phoenix.end2end.AutoCommitIT
Running org.apache.phoenix.end2end.AutoPartitionViewsIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.088 sec - in org.apache.phoenix.end2end.ArraysWithNullsIT
Running org.apache.phoenix.end2end.BinaryRowKeyIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.936 sec - in org.apache.phoenix.end2end.BinaryRowKeyIT
Running org.apache.phoenix.end2end.CSVCommonsLoaderIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.652 sec - in org.apache.phoenix.end2end.CSVCommonsLoaderIT
Running org.apache.phoenix.end2end.CbrtFunctionEnd2EndIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.812 sec - in org.apache.phoenix.end2end.CbrtFunctionEnd2EndIT
Running org.apache.phoenix.end2end.CoalesceFunctionIT
Tests run: 36, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 25.88 sec - in org.apache.phoenix.end2end.ArrayToStringFunctionIT
Running org.apache.phoenix.end2end.ConvertTimezoneFunctionIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.319 sec - in org.apache.phoenix.end2end.ConvertTimezoneFunctionIT
Running org.apache.phoenix.end2end.DateTimeIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.436 sec - in org.apache.phoenix.end2end.CoalesceFunctionIT
Running org.apache.phoenix.end2end.DecodeFunctionIT
Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 77.04 sec - in org.apache.phoenix.end2end.AlterTableWithViewsIT
Running org.apache.phoenix.end2end.DefaultColumnValueIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.482 sec - in org.apache.phoenix.end2end.DecodeFunctionIT
Running org.apache.phoenix.end2end.DeleteIT
Tests run: 12, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 35.594 sec - in org.apache.phoenix.end2end.AutoPartitionViewsIT
Running org.apache.phoenix.end2end.DisableLocalIndexIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.763 sec - in org.apache.phoenix.end2end.DisableLocalIndexIT
Running org.apache.phoenix.end2end.DistinctPrefixFilterIT
Tests run: 24, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 26.072 sec <<< FAILURE! - in org.apache.phoenix.end2end.DefaultColumnValueIT
testDefaultImmutableRows(org.apache.phoenix.end2end.DefaultColumnValueIT)  Time elapsed: 0.683 sec  <<< FAILURE!
java.lang.AssertionError: expected:<0> but was:<50>
	at org.apache.phoenix.end2end.DefaultColumnValueIT.testDefaultImmutableRows(DefaultColumnValueIT.java:303)

Running org.apache.phoenix.end2end.DynamicColumnIT
Tests run: 49, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 38.637 sec - in org.apache.phoenix.end2end.DateTimeIT
Tests run: 25, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 47.931 sec - in org.apache.phoenix.end2end.DeleteIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 28.718 sec - in org.apache.phoenix.end2end.DynamicColumnIT
Running org.apache.phoenix.end2end.DynamicUpsertIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.312 sec - in org.apache.phoenix.end2end.DynamicUpsertIT
Running org.apache.phoenix.end2end.EvaluationOfORIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.324 sec - in org.apache.phoenix.end2end.EvaluationOfORIT
Running org.apache.phoenix.end2end.ExecuteStatementsIT
Running org.apache.phoenix.end2end.DynamicFamilyIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.973 sec - in org.apache.phoenix.end2end.DynamicFamilyIT
Running org.apache.phoenix.end2end.ExpFunctionEnd2EndIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.532 sec - in org.apache.phoenix.end2end.ExpFunctionEnd2EndIT
Running org.apache.phoenix.end2end.FirstValueFunctionIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.59 sec - in org.apache.phoenix.end2end.FirstValueFunctionIT
Running org.apache.phoenix.end2end.FlappingAlterTableIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.882 sec - in org.apache.phoenix.end2end.ExecuteStatementsIT
Running org.apache.phoenix.end2end.GetSetByteBitFunctionEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.36 sec - in org.apache.phoenix.end2end.GetSetByteBitFunctionEnd2EndIT
Running org.apache.phoenix.end2end.GroupByCaseIT
Running org.apache.phoenix.end2end.EncodeFunctionIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.847 sec - in org.apache.phoenix.end2end.EncodeFunctionIT
Running org.apache.phoenix.end2end.HashJoinIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 66.786 sec - in org.apache.phoenix.end2end.DistinctPrefixFilterIT
Running org.apache.phoenix.end2end.HashJoinLocalIndexIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.497 sec - in org.apache.phoenix.end2end.FlappingAlterTableIT
Running org.apache.phoenix.end2end.HashJoinMoreIT
Tests run: 16, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 18.729 sec - in org.apache.phoenix.end2end.GroupByCaseIT
Running org.apache.phoenix.end2end.InListIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.38 sec - in org.apache.phoenix.end2end.HashJoinLocalIndexIT
Running org.apache.phoenix.end2end.InstrFunctionIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.406 sec - in org.apache.phoenix.end2end.HashJoinMoreIT
Running org.apache.phoenix.end2end.IsNullIT
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.949 sec - in org.apache.phoenix.end2end.InstrFunctionIT
Running org.apache.phoenix.end2end.LastValueFunctionIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.437 sec - in org.apache.phoenix.end2end.IsNullIT
Running org.apache.phoenix.end2end.LikeExpressionIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.334 sec - in org.apache.phoenix.end2end.LastValueFunctionIT
Running org.apache.phoenix.end2end.LnLogFunctionEnd2EndIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.925 sec - in org.apache.phoenix.end2end.LikeExpressionIT
Running org.apache.phoenix.end2end.MD5FunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.39 sec - in org.apache.phoenix.end2end.LnLogFunctionEnd2EndIT
Running org.apache.phoenix.end2end.MapReduceIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.367 sec - in org.apache.phoenix.end2end.MD5FunctionIT
Running org.apache.phoenix.end2end.MappingTableDataTypeIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.105 sec - in org.apache.phoenix.end2end.MapReduceIT
Running org.apache.phoenix.end2end.MinMaxAggregateFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.565 sec - in org.apache.phoenix.end2end.MinMaxAggregateFunctionIT
Running org.apache.phoenix.end2end.ModulusExpressionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.771 sec - in org.apache.phoenix.end2end.MappingTableDataTypeIT
Running org.apache.phoenix.end2end.NamespaceSchemaMappingIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 5.245 sec - in org.apache.phoenix.end2end.ModulusExpressionIT
Running org.apache.phoenix.end2end.NthValueFunctionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.336 sec - in org.apache.phoenix.end2end.NamespaceSchemaMappingIT
Running org.apache.phoenix.end2end.OctetLengthFunctionEnd2EndIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.818 sec - in org.apache.phoenix.end2end.OctetLengthFunctionEnd2EndIT
Running org.apache.phoenix.end2end.OnDuplicateKeyIT
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.56 sec - in org.apache.phoenix.end2end.NthValueFunctionIT
Running org.apache.phoenix.end2end.OrderByIT
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Recording test results

Build failed in Jenkins: Phoenix-encode-columns #23

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Phoenix-encode-columns/23/changes>

Changes:

[samarth] Fix test failure

------------------------------------------
[...truncated 1072 lines...]
Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE6
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)
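Every "Timed out waiting for lock for row" trace in this log has the same shape: MetaDataEndpointImpl.acquireLock asks HRegion for a per-row lock on the SYSTEM.CATALOG row, the lock is never released by its current holder, and the acquire gives up after its timeout with an IOException. A minimal sketch of that failure mode, using a plain Python threading.Lock rather than HBase's real row-lock API (the function name, timeout, and row key string are illustrative only):

```python
import threading

def try_drop_table(row_lock: threading.Lock, timeout_s: float) -> None:
    """Attempt a metadata mutation that requires an exclusive per-row lock.

    If another holder keeps the lock past the timeout, fail with an
    IOError analogous to the "Timed out waiting for lock for row" above.
    """
    if not row_lock.acquire(timeout=timeout_s):
        raise IOError("Timed out waiting for lock for row: \\x00\\x00TABLE6")
    try:
        pass  # perform the mutation while holding the row lock
    finally:
        row_lock.release()
```

The design point the sketch illustrates: because the lock attempt is bounded, a stuck holder surfaces as a timeout error in the caller (here, the test's table-drop during cleanup) instead of hanging the RPC handler thread forever.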


testImportOneLocalIndexTable(org.apache.phoenix.end2end.CsvBulkLoadToolIT)  Time elapsed: 2,407.411 sec  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: callTimeout=1200000, callDuration=1202402: row '  TABLE5_IDX' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478639182330.f6c4adc266ec47261b7f9c5ca35b8af1., hostname=asf914.gq1.ygridcore.net,35948,1478639176615, seqNum=1
	at org.apache.phoenix.end2end.CsvBulkLoadToolIT.testImportOneIndexTable(CsvBulkLoadToolIT.java:309)
	at org.apache.phoenix.end2end.CsvBulkLoadToolIT.testImportOneLocalIndexTable(CsvBulkLoadToolIT.java:297)
Caused by: java.net.SocketTimeoutException: callTimeout=1200000, callDuration=1202402: row '  TABLE5_IDX' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478639182330.f6c4adc266ec47261b7f9c5ca35b8af1., hostname=asf914.gq1.ygridcore.net,35948,1478639176615, seqNum=1
Caused by: java.net.SocketTimeoutException: Call to asf914.gq1.ygridcore.net/67.195.81.158:35948 failed because java.net.SocketTimeoutException: 1200000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/67.195.81.158:56306 remote=asf914.gq1.ygridcore.net/67.195.81.158:35948]
Caused by: java.net.SocketTimeoutException: 1200000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/67.195.81.158:56306 remote=asf914.gq1.ygridcore.net/67.195.81.158:35948]

testImportOneLocalIndexTable(org.apache.phoenix.end2end.CsvBulkLoadToolIT)  Time elapsed: 2,407.412 sec  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: callTimeout=1200000, callDuration=1222555: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478639182330.f6c4adc266ec47261b7f9c5ca35b8af1., hostname=asf914.gq1.ygridcore.net,35948,1478639176615, seqNum=1
Caused by: java.net.SocketTimeoutException: callTimeout=1200000, callDuration=1222555: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478639182330.f6c4adc266ec47261b7f9c5ca35b8af1., hostname=asf914.gq1.ygridcore.net,35948,1478639176615, seqNum=1
Caused by: java.io.IOException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)


testInvalidArguments(org.apache.phoenix.end2end.CsvBulkLoadToolIT)  Time elapsed: 1,199.97 sec  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: callTimeout=1200000, callDuration=1221909: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478639182330.f6c4adc266ec47261b7f9c5ca35b8af1., hostname=asf914.gq1.ygridcore.net,35948,1478639176615, seqNum=1
Caused by: java.net.SocketTimeoutException: callTimeout=1200000, callDuration=1221909: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478639182330.f6c4adc266ec47261b7f9c5ca35b8af1., hostname=asf914.gq1.ygridcore.net,35948,1478639176615, seqNum=1
Caused by: java.io.IOException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)


testImportWithIndex(org.apache.phoenix.end2end.CsvBulkLoadToolIT)  Time elapsed: 1,205.306 sec  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: callTimeout=1200000, callDuration=1222121: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478639182330.f6c4adc266ec47261b7f9c5ca35b8af1., hostname=asf914.gq1.ygridcore.net,35948,1478639176615, seqNum=1
Caused by: java.net.SocketTimeoutException: callTimeout=1200000, callDuration=1222121: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478639182330.f6c4adc266ec47261b7f9c5ca35b8af1., hostname=asf914.gq1.ygridcore.net,35948,1478639176615, seqNum=1
Caused by: java.io.IOException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)


testImportOneIndexTable(org.apache.phoenix.end2end.CsvBulkLoadToolIT)  Time elapsed: 1,203.85 sec  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: callTimeout=1200000, callDuration=1221970: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478639182330.f6c4adc266ec47261b7f9c5ca35b8af1., hostname=asf914.gq1.ygridcore.net,35948,1478639176615, seqNum=1
Caused by: java.net.SocketTimeoutException: callTimeout=1200000, callDuration=1221970: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478639182330.f6c4adc266ec47261b7f9c5ca35b8af1., hostname=asf914.gq1.ygridcore.net,35948,1478639176615, seqNum=1
Caused by: java.io.IOException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)



Results :

Tests in error: 
  CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO
  CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO
  CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO
org.apache.phoenix.end2end.CsvBulkLoadToolIT.testImportOneLocalIndexTable(org.apache.phoenix.end2end.CsvBulkLoadToolIT)
  Run 1: CsvBulkLoadToolIT.testImportOneLocalIndexTable:297->testImportOneIndexTable:309 » PhoenixIO
  Run 2: CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO

  CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO
org.apache.phoenix.end2end.CsvBulkLoadToolIT.testImportWithLocalIndex(org.apache.phoenix.end2end.CsvBulkLoadToolIT)
  Run 1: CsvBulkLoadToolIT.testImportWithLocalIndex:258 » PhoenixIO callTimeout=1200000...
  Run 2: CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO

  CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO
  CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO

Tests run: 207, Failures: 0, Errors: 8, Skipped: 17

[INFO] 
[INFO] --- maven-failsafe-plugin:2.19.1:verify (ParallelStatsEnabledTest) @ phoenix-core ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Phoenix ..................................... SUCCESS [  1.821 s]
[INFO] Phoenix Core ....................................... FAILURE [  03:58 h]
[INFO] Phoenix - Flume .................................... SKIPPED
[INFO] Phoenix - Pig ...................................... SKIPPED
[INFO] Phoenix Query Server Client ........................ SKIPPED
[INFO] Phoenix Query Server ............................... SKIPPED
[INFO] Phoenix - Pherf .................................... SKIPPED
[INFO] Phoenix - Spark .................................... SKIPPED
[INFO] Phoenix - Hive ..................................... SKIPPED
[INFO] Phoenix Client ..................................... SKIPPED
[INFO] Phoenix Server ..................................... SKIPPED
[INFO] Phoenix Assembly ................................... SKIPPED
[INFO] Phoenix - Tracing Web Application .................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:58 h
[INFO] Finished at: 2016-11-09T00:27:33+00:00
[INFO] Final Memory: 59M/1165M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.19.1:verify (ParallelStatsEnabledTest) on project phoenix-core: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/target/failsafe-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :phoenix-core
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Recording test results

Build failed in Jenkins: Phoenix-encode-columns #22

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Phoenix-encode-columns/22/>

------------------------------------------
[...truncated 6910 lines...]
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)


testImportOneLocalIndexTable(org.apache.phoenix.end2end.CsvBulkLoadToolIT)  Time elapsed: 2,406.308 sec  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: callTimeout=1200000, callDuration=1202405: row '  TABLE5_IDX' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478593999496.fe304a602458c05840319b1c93a814b1., hostname=asf914.gq1.ygridcore.net,36794,1478593993736, seqNum=1
	at org.apache.phoenix.end2end.CsvBulkLoadToolIT.testImportOneIndexTable(CsvBulkLoadToolIT.java:309)
	at org.apache.phoenix.end2end.CsvBulkLoadToolIT.testImportOneLocalIndexTable(CsvBulkLoadToolIT.java:297)
Caused by: java.net.SocketTimeoutException: callTimeout=1200000, callDuration=1202405: row '  TABLE5_IDX' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478593999496.fe304a602458c05840319b1c93a814b1., hostname=asf914.gq1.ygridcore.net,36794,1478593993736, seqNum=1
Caused by: java.net.SocketTimeoutException: Call to asf914.gq1.ygridcore.net/67.195.81.158:36794 failed because java.net.SocketTimeoutException: 1200000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/67.195.81.158:40248 remote=asf914.gq1.ygridcore.net/67.195.81.158:36794]
Caused by: java.net.SocketTimeoutException: 1200000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/67.195.81.158:40248 remote=asf914.gq1.ygridcore.net/67.195.81.158:36794]

testImportOneLocalIndexTable(org.apache.phoenix.end2end.CsvBulkLoadToolIT)  Time elapsed: 2,406.309 sec  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: callTimeout=1200000, callDuration=1221879: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478593999496.fe304a602458c05840319b1c93a814b1., hostname=asf914.gq1.ygridcore.net,36794,1478593993736, seqNum=1
Caused by: java.net.SocketTimeoutException: callTimeout=1200000, callDuration=1221879: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478593999496.fe304a602458c05840319b1c93a814b1., hostname=asf914.gq1.ygridcore.net,36794,1478593993736, seqNum=1
Caused by: java.io.IOException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)


testInvalidArguments(org.apache.phoenix.end2end.CsvBulkLoadToolIT)  Time elapsed: 1,200.181 sec  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: callTimeout=1200000, callDuration=1222076: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478593999496.fe304a602458c05840319b1c93a814b1., hostname=asf914.gq1.ygridcore.net,36794,1478593993736, seqNum=1
Caused by: java.net.SocketTimeoutException: callTimeout=1200000, callDuration=1222076: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478593999496.fe304a602458c05840319b1c93a814b1., hostname=asf914.gq1.ygridcore.net,36794,1478593993736, seqNum=1
Caused by: java.io.IOException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)


testImportWithIndex(org.apache.phoenix.end2end.CsvBulkLoadToolIT)  Time elapsed: 1,205.637 sec  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: callTimeout=1200000, callDuration=1221935: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478593999496.fe304a602458c05840319b1c93a814b1., hostname=asf914.gq1.ygridcore.net,36794,1478593993736, seqNum=1
Caused by: java.net.SocketTimeoutException: callTimeout=1200000, callDuration=1221935: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478593999496.fe304a602458c05840319b1c93a814b1., hostname=asf914.gq1.ygridcore.net,36794,1478593993736, seqNum=1
Caused by: java.io.IOException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)


testImportOneIndexTable(org.apache.phoenix.end2end.CsvBulkLoadToolIT)  Time elapsed: 1,205.12 sec  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: callTimeout=1200000, callDuration=1222008: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478593999496.fe304a602458c05840319b1c93a814b1., hostname=asf914.gq1.ygridcore.net,36794,1478593993736, seqNum=1
Caused by: java.net.SocketTimeoutException: callTimeout=1200000, callDuration=1222008: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478593999496.fe304a602458c05840319b1c93a814b1., hostname=asf914.gq1.ygridcore.net,36794,1478593993736, seqNum=1
Caused by: java.io.IOException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)



Results :

Failed tests: 
  IndexExtendedIT.testSecondaryIndex:262 expected:<...LTER BY (LPAD(UPPER([]NAME), 8, 'x') || '_...> but was:<...LTER BY (LPAD(UPPER([0.]NAME), 8, 'x') || '_...>
  IndexExtendedIT.testSecondaryIndex:262 expected:<...LTER BY (LPAD(UPPER([]NAME), 8, 'x') || '_...> but was:<...LTER BY (LPAD(UPPER([0.]NAME), 8, 'x') || '_...>
Tests in error: 
  CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO
  CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO
  CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO
org.apache.phoenix.end2end.CsvBulkLoadToolIT.testImportOneLocalIndexTable(org.apache.phoenix.end2end.CsvBulkLoadToolIT)
  Run 1: CsvBulkLoadToolIT.testImportOneLocalIndexTable:297->testImportOneIndexTable:309 » PhoenixIO
  Run 2: CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO

  CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO
org.apache.phoenix.end2end.CsvBulkLoadToolIT.testImportWithLocalIndex(org.apache.phoenix.end2end.CsvBulkLoadToolIT)
  Run 1: CsvBulkLoadToolIT.testImportWithLocalIndex:258 » PhoenixIO callTimeout=1200000...
  Run 2: CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO

  CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO
  CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO

Tests run: 207, Failures: 2, Errors: 8, Skipped: 17

[INFO] 
[INFO] --- maven-failsafe-plugin:2.19.1:verify (ParallelStatsEnabledTest) @ phoenix-core ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Phoenix ..................................... SUCCESS [  1.882 s]
[INFO] Phoenix Core ....................................... FAILURE [  03:58 h]
[INFO] Phoenix - Flume .................................... SKIPPED
[INFO] Phoenix - Pig ...................................... SKIPPED
[INFO] Phoenix Query Server Client ........................ SKIPPED
[INFO] Phoenix Query Server ............................... SKIPPED
[INFO] Phoenix - Pherf .................................... SKIPPED
[INFO] Phoenix - Spark .................................... SKIPPED
[INFO] Phoenix - Hive ..................................... SKIPPED
[INFO] Phoenix Client ..................................... SKIPPED
[INFO] Phoenix Server ..................................... SKIPPED
[INFO] Phoenix Assembly ................................... SKIPPED
[INFO] Phoenix - Tracing Web Application .................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:58 h
[INFO] Finished at: 2016-11-08T11:54:30+00:00
[INFO] Final Memory: 62M/1229M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.19.1:verify (ParallelStatsEnabledTest) on project phoenix-core: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/target/failsafe-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :phoenix-core
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Recording test results

Build failed in Jenkins: Phoenix-encode-columns #21

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Phoenix-encode-columns/21/changes>

Changes:

[samarth] Fix compilation failure

------------------------------------------
[...truncated 3478 lines...]
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)


testImportOneLocalIndexTable(org.apache.phoenix.end2end.CsvBulkLoadToolIT)  Time elapsed: 2,406.081 sec  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: callTimeout=1200000, callDuration=1202377: row '  TABLE5_IDX' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478556212597.8858e1c7c8ab85951f2a9cec321f57ed., hostname=asf912.gq1.ygridcore.net,33372,1478556206799, seqNum=1
	at org.apache.phoenix.end2end.CsvBulkLoadToolIT.testImportOneIndexTable(CsvBulkLoadToolIT.java:309)
	at org.apache.phoenix.end2end.CsvBulkLoadToolIT.testImportOneLocalIndexTable(CsvBulkLoadToolIT.java:297)
Caused by: java.net.SocketTimeoutException: callTimeout=1200000, callDuration=1202377: row '  TABLE5_IDX' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478556212597.8858e1c7c8ab85951f2a9cec321f57ed., hostname=asf912.gq1.ygridcore.net,33372,1478556206799, seqNum=1
Caused by: java.net.SocketTimeoutException: Call to asf912.gq1.ygridcore.net/67.195.81.156:33372 failed because java.net.SocketTimeoutException: 1200000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/67.195.81.156:35856 remote=asf912.gq1.ygridcore.net/67.195.81.156:33372]
Caused by: java.net.SocketTimeoutException: 1200000 millis timeout while waiting for channel to be ready for read. ch : java.nio.channels.SocketChannel[connected local=/67.195.81.156:35856 remote=asf912.gq1.ygridcore.net/67.195.81.156:33372]

testImportOneLocalIndexTable(org.apache.phoenix.end2end.CsvBulkLoadToolIT)  Time elapsed: 2,406.083 sec  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: callTimeout=1200000, callDuration=1222001: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478556212597.8858e1c7c8ab85951f2a9cec321f57ed., hostname=asf912.gq1.ygridcore.net,33372,1478556206799, seqNum=1
Caused by: java.net.SocketTimeoutException: callTimeout=1200000, callDuration=1222001: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478556212597.8858e1c7c8ab85951f2a9cec321f57ed., hostname=asf912.gq1.ygridcore.net,33372,1478556206799, seqNum=1
Caused by: java.io.IOException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)


testInvalidArguments(org.apache.phoenix.end2end.CsvBulkLoadToolIT)  Time elapsed: 1,199.882 sec  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: callTimeout=1200000, callDuration=1221893: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478556212597.8858e1c7c8ab85951f2a9cec321f57ed., hostname=asf912.gq1.ygridcore.net,33372,1478556206799, seqNum=1
Caused by: java.net.SocketTimeoutException: callTimeout=1200000, callDuration=1221893: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478556212597.8858e1c7c8ab85951f2a9cec321f57ed., hostname=asf912.gq1.ygridcore.net,33372,1478556206799, seqNum=1
Caused by: java.io.IOException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)


testImportWithIndex(org.apache.phoenix.end2end.CsvBulkLoadToolIT)  Time elapsed: 1,204.287 sec  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: callTimeout=1200000, callDuration=1221813: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478556212597.8858e1c7c8ab85951f2a9cec321f57ed., hostname=asf912.gq1.ygridcore.net,33372,1478556206799, seqNum=1
Caused by: java.net.SocketTimeoutException: callTimeout=1200000, callDuration=1221813: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478556212597.8858e1c7c8ab85951f2a9cec321f57ed., hostname=asf912.gq1.ygridcore.net,33372,1478556206799, seqNum=1
Caused by: java.io.IOException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)


testImportOneIndexTable(org.apache.phoenix.end2end.CsvBulkLoadToolIT)  Time elapsed: 1,205.342 sec  <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: callTimeout=1200000, callDuration=1221716: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478556212597.8858e1c7c8ab85951f2a9cec321f57ed., hostname=asf912.gq1.ygridcore.net,33372,1478556206799, seqNum=1
Caused by: java.net.SocketTimeoutException: callTimeout=1200000, callDuration=1221716: row '  TABLE5' on table 'SYSTEM.CATALOG' at region=SYSTEM.CATALOG,,1478556212597.8858e1c7c8ab85951f2a9cec321f57ed., hostname=asf912.gq1.ygridcore.net,33372,1478556206799, seqNum=1
Caused by: java.io.IOException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)

Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: 
java.io.IOException: Timed out waiting for lock for row: \x00\x00TABLE5
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLockInternal(HRegion.java:3804)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3766)
	at org.apache.hadoop.hbase.regionserver.HRegion.getRowLock(HRegion.java:3830)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.acquireLock(MetaDataEndpointImpl.java:1604)
	at org.apache.phoenix.coprocessor.MetaDataEndpointImpl.dropTable(MetaDataEndpointImpl.java:1746)
	at org.apache.phoenix.coprocessor.generated.MetaDataProtos$MetaDataService.callMethod(MetaDataProtos.java:16297)
	at org.apache.hadoop.hbase.regionserver.HRegion.execService(HRegion.java:6041)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execServiceOnRegion(HRegionServer.java:3520)
	at org.apache.hadoop.hbase.regionserver.HRegionServer.execService(HRegionServer.java:3502)
	at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31194)
	at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2149)
	at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
	at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
	at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
	at java.lang.Thread.run(Thread.java:745)
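
The callTimeout=1200000 in the traces above is a 20-minute HBase client RPC budget being exhausted while the SYSTEM.CATALOG row lock is held. If the timeout itself (rather than the underlying lock contention) needed adjusting on a test cluster, the relevant client settings live in hbase-site.xml; the property names below are real HBase configuration keys, but the values are only illustrative, not recommendations:

```xml
<!-- hbase-site.xml fragment; values illustrative only -->
<configuration>
  <property>
    <name>hbase.rpc.timeout</name>
    <!-- per-RPC timeout in ms; 1200000 matches the callTimeout in the log -->
    <value>1200000</value>
  </property>
  <property>
    <name>hbase.client.operation.timeout</name>
    <!-- overall budget for one client operation, including retries, in ms -->
    <value>1800000</value>
  </property>
</configuration>
```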



Results :

Failed tests: 
  IndexExtendedIT.testSecondaryIndex:262 expected:<...LTER BY (LPAD(UPPER([]NAME), 8, 'x') || '_...> but was:<...LTER BY (LPAD(UPPER([0.]NAME), 8, 'x') || '_...>
  IndexExtendedIT.testSecondaryIndex:262 expected:<...LTER BY (LPAD(UPPER([]NAME), 8, 'x') || '_...> but was:<...LTER BY (LPAD(UPPER([0.]NAME), 8, 'x') || '_...>
Tests in error: 
  CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO
  CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO
  CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO
org.apache.phoenix.end2end.CsvBulkLoadToolIT.testImportOneLocalIndexTable(org.apache.phoenix.end2end.CsvBulkLoadToolIT)
  Run 1: CsvBulkLoadToolIT.testImportOneLocalIndexTable:297->testImportOneIndexTable:309 » PhoenixIO
  Run 2: CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO

  CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO
org.apache.phoenix.end2end.CsvBulkLoadToolIT.testImportWithLocalIndex(org.apache.phoenix.end2end.CsvBulkLoadToolIT)
  Run 1: CsvBulkLoadToolIT.testImportWithLocalIndex:258 » PhoenixIO callTimeout=1200000...
  Run 2: CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO

  CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO
  CsvBulkLoadToolIT>BaseOwnClusterIT.cleanUpAfterTest:35->BaseTest.deletePriorMetaData:857->BaseTest.deletePriorTables:865->BaseTest.deletePriorTables:876->BaseTest.deletePriorTables:921 » PhoenixIO

Tests run: 171, Failures: 2, Errors: 8, Skipped: 17

[INFO] 
[INFO] --- maven-failsafe-plugin:2.19.1:verify (ParallelStatsEnabledTest) @ phoenix-core ---
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Phoenix ..................................... SUCCESS [  1.757 s]
[INFO] Phoenix Core ....................................... FAILURE [  03:58 h]
[INFO] Phoenix - Flume .................................... SKIPPED
[INFO] Phoenix - Pig ...................................... SKIPPED
[INFO] Phoenix Query Server Client ........................ SKIPPED
[INFO] Phoenix Query Server ............................... SKIPPED
[INFO] Phoenix - Pherf .................................... SKIPPED
[INFO] Phoenix - Spark .................................... SKIPPED
[INFO] Phoenix - Hive ..................................... SKIPPED
[INFO] Phoenix Client ..................................... SKIPPED
[INFO] Phoenix Server ..................................... SKIPPED
[INFO] Phoenix Assembly ................................... SKIPPED
[INFO] Phoenix - Tracing Web Application .................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 03:58 h
[INFO] Finished at: 2016-11-08T01:24:43+00:00
[INFO] Final Memory: 62M/1299M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-failsafe-plugin:2.19.1:verify (ParallelStatsEnabledTest) on project phoenix-core: There are test failures.
[ERROR] 
[ERROR] Please refer to <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/target/failsafe-reports> for the individual test results.
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :phoenix-core
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Recording test results

Build failed in Jenkins: Phoenix-encode-columns #20

Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Phoenix-encode-columns/20/>

------------------------------------------
[...truncated 39489 lines...]
Downloaded: https://repo.maven.apache.org/maven2/log4j/log4j/1.2.12/log4j-1.2.12.jar (350 KB at 5550.7 KB/sec)
Downloaded: https://repo.maven.apache.org/maven2/junit/junit/3.8.2/junit-3.8.2.jar (118 KB at 1870.0 KB/sec)
Downloaded: https://repo.maven.apache.org/maven2/com/google/collections/google-collections/1.0/google-collections-1.0.jar (625 KB at 7711.1 KB/sec)
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 968 source files to <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/target/classes>
[INFO] -------------------------------------------------------------
[WARNING] COMPILATION WARNING : 
[INFO] -------------------------------------------------------------
[WARNING] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/execute/DescVarLengthFastByteComparisons.java>:[25,16] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/execute/DescVarLengthFastByteComparisons.java>:[116,26] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/execute/DescVarLengthFastByteComparisons.java>:[122,30] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/execute/DescVarLengthFastByteComparisons.java>:[126,39] sun.misc.Unsafe is internal proprietary API and may be removed in a future release
[WARNING] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java>: Some input files use or override a deprecated API.
[WARNING] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/mapreduce/index/IndexTool.java>: Recompile with -Xlint:deprecation for details.
[WARNING] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixStatement.java>: Some input files use unchecked or unsafe operations.
[WARNING] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/jdbc/PhoenixStatement.java>: Recompile with -Xlint:unchecked for details.
[INFO] 8 warnings 
[INFO] -------------------------------------------------------------
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR : 
[INFO] -------------------------------------------------------------
[ERROR] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/util/PhoenixRuntime.java>:[1151,29] cannot find symbol
  symbol:   method getColumn(java.lang.String)
  location: variable family of type org.apache.phoenix.schema.PColumnFamily
[ERROR] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/util/PhoenixRuntime.java>:[1153,28] cannot find symbol
  symbol:   method getColumn(java.lang.String)
  location: variable table of type org.apache.phoenix.schema.PTable
[INFO] 2 errors 
[INFO] -------------------------------------------------------------
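
The two "cannot find symbol" errors above mean the compiler could not resolve a getColumn(java.lang.String) call in PhoenixRuntime.java against PColumnFamily and PTable, i.e. a caller was not updated when the column-lookup method changed on this branch. As a generic illustration of that failure class (the Family interface and method names below are hypothetical stand-ins, not Phoenix's actual API), the same method-resolution check can be made reflectively:

```java
import java.lang.reflect.Method;

public class MethodCheck {
    // Hypothetical stand-in for an interface whose lookup method was renamed.
    interface Family {
        Object getPColumnForColumnName(String name);
    }

    // True iff cls exposes a public method with this name and signature --
    // the same lookup javac performs when it reports "cannot find symbol".
    static boolean hasMethod(Class<?> cls, String name, Class<?>... params) {
        try {
            cls.getMethod(name, params);
            return true;
        } catch (NoSuchMethodException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // The old name is gone, so resolution fails:
        System.out.println(hasMethod(Family.class, "getColumn", String.class));
        // The renamed method resolves:
        System.out.println(hasMethod(Family.class, "getPColumnForColumnName", String.class));
    }
}
```

The compile-time fix is the same idea in reverse: update the call sites to the renamed method rather than probing at runtime.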
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO] 
[INFO] Apache Phoenix ..................................... SUCCESS [  3.635 s]
[INFO] Phoenix Core ....................................... FAILURE [02:47 min]
[INFO] Phoenix - Flume .................................... SKIPPED
[INFO] Phoenix - Pig ...................................... SKIPPED
[INFO] Phoenix Query Server Client ........................ SKIPPED
[INFO] Phoenix Query Server ............................... SKIPPED
[INFO] Phoenix - Pherf .................................... SKIPPED
[INFO] Phoenix - Spark .................................... SKIPPED
[INFO] Phoenix - Hive ..................................... SKIPPED
[INFO] Phoenix Client ..................................... SKIPPED
[INFO] Phoenix Server ..................................... SKIPPED
[INFO] Phoenix Assembly ................................... SKIPPED
[INFO] Phoenix - Tracing Web Application .................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:53 min
[INFO] Finished at: 2016-11-07T21:24:53+00:00
[INFO] Final Memory: 79M/1963M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.0:compile (default-compile) on project phoenix-core: Compilation failure: Compilation failure:
[ERROR] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/util/PhoenixRuntime.java>:[1151,29] cannot find symbol
[ERROR] symbol:   method getColumn(java.lang.String)
[ERROR] location: variable family of type org.apache.phoenix.schema.PColumnFamily
[ERROR] <https://builds.apache.org/job/Phoenix-encode-columns/ws/phoenix-core/src/main/java/org/apache/phoenix/util/PhoenixRuntime.java>:[1153,28] cannot find symbol
[ERROR] symbol:   method getColumn(java.lang.String)
[ERROR] location: variable table of type org.apache.phoenix.schema.PTable
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :phoenix-core
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
ERROR: Failed to archive artifacts: **/surefire-reports/*,**/failsafe-reports/*.xml,**/*.txt,**/*.jar,**/bin/*.sh,**/bin/*.*,**/bin/*.xml,**/bin/*.properties
hudson.util.IOException2: java.lang.InterruptedException
stream=1f8b08... [garbled gzip-compressed stream bytes omitted]
	at com.cloudbees.jenkins.plugins.jsync.archiver.JSyncArtifactManager.inflate(JSyncArtifactManager.java:286)
	at com.cloudbees.jenkins.plugins.jsync.archiver.JSyncArtifactManager.remoteSync(JSyncArtifactManager.java:155)
	at com.cloudbees.jenkins.plugins.jsync.archiver.JSyncArtifactManager.archive(JSyncArtifactManager.java:76)
	at hudson.tasks.ArtifactArchiver.perform(ArtifactArchiver.java:238)
	at hudson.tasks.BuildStepCompatibilityLayer.perform(BuildStepCompatibilityLayer.java:78)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:779)
	at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:720)
	at hudson.model.Build$BuildExecution.post2(Build.java:185)
	at hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:665)
	at hudson.model.Run.execute(Run.java:1766)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:98)
	at hudson.model.Executor.run(Executor.java:410)
Caused by: java.io.IOException: java.lang.InterruptedException
	at hudson.remoting.FastPipedInputStream.read(FastPipedInputStream.java:177)
	at hudson.remoting.FastPipedInputStream.read(FastPipedInputStream.java:146)
	at hudson.util.HeadBufferingStream.read(HeadBufferingStream.java:53)
	at java.util.zip.CheckedInputStream.read(CheckedInputStream.java:59)
	at java.util.zip.GZIPInputStream.readUByte(GZIPInputStream.java:266)
	at java.util.zip.GZIPInputStream.readUShort(GZIPInputStream.java:258)
	at java.util.zip.GZIPInputStream.readHeader(GZIPInputStream.java:164)
	at java.util.zip.GZIPInputStream.<init>(GZIPInputStream.java:79)
	at com.cloudbees.jenkins.plugins.jsync.archiver.JSyncArtifactManager.inflate(JSyncArtifactManager.java:282)
	... 13 more
Caused by: java.lang.InterruptedException
	at java.lang.Object.wait(Native Method)
	at hudson.remoting.FastPipedInputStream.read(FastPipedInputStream.java:175)
	... 21 more
Recording test results
ERROR: Step 'Publish JUnit test result report' failed: No test report files were found. Configuration error?