Posted to commits@phoenix.apache.org by Apache Jenkins Server <je...@builds.apache.org> on 2015/03/26 20:41:51 UTC
Build failed in Jenkins: Phoenix | Master #642
See <https://builds.apache.org/job/Phoenix-master/642/changes>
Changes:
[twdsilva] Fix IndexExpressionIT test failures
------------------------------------------
[...truncated 3892 lines...]
Downloaded: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-server/1.0.1-SNAPSHOT/hbase-server-1.0.1-20150326.011647-2-tests.jar (4437 KB at 4189.5 KB/sec)
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ phoenix-core ---
[INFO] Deleting <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/target>
[INFO]
[INFO] --- build-helper-maven-plugin:1.9.1:add-test-source (add-test-source) @ phoenix-core ---
[INFO] Test Source directory: <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/it/java> added.
[INFO]
[INFO] --- build-helper-maven-plugin:1.9.1:add-test-resource (add-test-resource) @ phoenix-core ---
[INFO]
[INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-source) @ phoenix-core ---
[INFO] Source directory: <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/target/generated-sources/antlr3> added.
[INFO] Source directory: <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/antlr3> added.
[INFO]
[INFO] --- antlr3-maven-plugin:3.5:antlr (default) @ phoenix-core ---
[INFO] ANTLR: Processing source directory <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/antlr3>
ANTLR Parser Generator Version 3.5
Output file <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/target/generated-sources/antlr3/org/apache/phoenix/parse/PhoenixSQLParser.java> does not exist: must build <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/antlr3/PhoenixSQL.g>
PhoenixSQL.g
[INFO]
[INFO] --- maven-dependency-plugin:2.1:build-classpath (create-phoenix-generated-classpath) @ phoenix-core ---
[INFO] Wrote classpath file '<https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/target/cached_classpath.txt>'.
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ phoenix-core ---
Downloading: http://people.apache.org/~garyh/mvn/org/apache/hbase/hbase/1.0.1-SNAPSHOT/maven-metadata.xml
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ phoenix-core ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource to META-INF/services
[INFO] Copying 3 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.0:compile (default-compile) @ phoenix-core ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 785 source files to <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/target/classes>
[INFO] -------------------------------------------------------------
[WARNING] COMPILATION WARNING :
[INFO] -------------------------------------------------------------
[WARNING] <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/java/org/apache/phoenix/hbase/index/write/ParallelWriterIndexCommitter.java>: Some input files use or override a deprecated API.
[WARNING] <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/java/org/apache/phoenix/hbase/index/write/ParallelWriterIndexCommitter.java>: Recompile with -Xlint:deprecation for details.
[WARNING] <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/java/org/apache/phoenix/parse/FunctionParseNode.java>: Some input files use unchecked or unsafe operations.
[WARNING] <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/java/org/apache/phoenix/parse/FunctionParseNode.java>: Recompile with -Xlint:unchecked for details.
[INFO] 4 warnings
[INFO] -------------------------------------------------------------
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/java/org/apache/phoenix/trace/util/Tracing.java>:[119,49] SAMPLER_FRACTION_CONF_KEY has private access in org.apache.htrace.impl.ProbabilitySampler
[ERROR] <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/java/org/apache/phoenix/trace/util/Tracing.java>:[143,39] SAMPLER_FRACTION_CONF_KEY has private access in org.apache.htrace.impl.ProbabilitySampler
[INFO] 2 errors
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Phoenix .................................... SUCCESS [2.058s]
[INFO] Phoenix Core ...................................... FAILURE [1:21.128s]
[INFO] Phoenix - Flume ................................... SKIPPED
[INFO] Phoenix - Pig ..................................... SKIPPED
[INFO] Phoenix Assembly .................................. SKIPPED
[INFO] Phoenix - Pherf ................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 1:23.613s
[INFO] Finished at: Thu Mar 26 19:39:42 UTC 2015
[INFO] Final Memory: 48M/726M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.0:compile (default-compile) on project phoenix-core: Compilation failure: Compilation failure:
[ERROR] <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/java/org/apache/phoenix/trace/util/Tracing.java>:[119,49] SAMPLER_FRACTION_CONF_KEY has private access in org.apache.htrace.impl.ProbabilitySampler
[ERROR] <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/java/org/apache/phoenix/trace/util/Tracing.java>:[143,39] SAMPLER_FRACTION_CONF_KEY has private access in org.apache.htrace.impl.ProbabilitySampler
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :phoenix-core
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Sending artifact delta relative to Phoenix | Master #628
Archived 73 artifacts
Archive block size is 32768
Received 1120 blocks and 101388201 bytes
Compression is 26.6%
Took 2 min 8 sec
Recording test results
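The two compilation errors above come from Tracing.java referencing ProbabilitySampler.SAMPLER_FRACTION_CONF_KEY, a private constant in another class — something javac always rejects. A minimal, hypothetical sketch of that failure mode and a common local-constant workaround (class and field names mirror the log; the bodies and the key's value are invented for illustration):

```java
// Stand-in for org.apache.htrace.impl.ProbabilitySampler; body is invented.
class ProbabilitySampler {
    // Private members are visible only inside their own top-level class,
    // so javac rejects any outside reference with "has private access".
    private static final String SAMPLER_FRACTION_CONF_KEY = "sampler.fraction";
}

public class Main {
    // One conventional workaround: declare a local copy of the key string
    // instead of reaching into another class's private constant.
    static final String SAMPLER_FRACTION_CONF_KEY = "sampler.fraction";

    public static void main(String[] args) {
        // Direct access would fail to compile, exactly as in the build log:
        // String k = ProbabilitySampler.SAMPLER_FRACTION_CONF_KEY; // error: private access
        System.out.println(SAMPLER_FRACTION_CONF_KEY);
    }
}
```

The fix options are the usual ones for this class of break: make the upstream field public, or stop referencing it and own the string locally.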
Jenkins build is back to normal : Phoenix | Master #646
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Phoenix-master/646/changes>
Build failed in Jenkins: Phoenix | Master #645
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Phoenix-master/645/changes>
Changes:
[jamestaylor] PHOENIX-1783 Fix IDE compiler errors for JodaTimezoneCacheTest
------------------------------------------
[...truncated 87 lines...]
Downloaded: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase/1.0.1-SNAPSHOT/maven-metadata.xml (2 KB at 4.5 KB/sec)
Downloading: https://repository.apache.org/content/repositories/releases/org/apache/hbase/hbase-common/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://conjars.org/repo/org/apache/hbase/hbase-common/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://oss.sonatype.org/content/repositories/snapshots/org/apache/hbase/hbase-common/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-common/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://repository.apache.org/snapshots/org/apache/hbase/hbase-common/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://people.apache.org/~garyh/mvn/org/apache/hbase/hbase-common/1.0.1-SNAPSHOT/maven-metadata.xml
Downloaded: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-common/1.0.1-SNAPSHOT/maven-metadata.xml (3 KB at 12.1 KB/sec)
Downloaded: http://repository.apache.org/snapshots/org/apache/hbase/hbase-common/1.0.1-SNAPSHOT/maven-metadata.xml (3 KB at 13.8 KB/sec)
Downloading: http://conjars.org/repo/org/apache/hbase/hbase-annotations/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://oss.sonatype.org/content/repositories/snapshots/org/apache/hbase/hbase-annotations/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-annotations/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://repository.apache.org/content/repositories/releases/org/apache/hbase/hbase-annotations/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://repository.apache.org/snapshots/org/apache/hbase/hbase-annotations/1.0.1-SNAPSHOT/maven-metadata.xml
Downloaded: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-annotations/1.0.1-SNAPSHOT/maven-metadata.xml (3 KB at 32.5 KB/sec)
Downloading: http://people.apache.org/~garyh/mvn/org/apache/hbase/hbase-annotations/1.0.1-SNAPSHOT/maven-metadata.xml
Downloaded: http://repository.apache.org/snapshots/org/apache/hbase/hbase-annotations/1.0.1-SNAPSHOT/maven-metadata.xml (3 KB at 32.9 KB/sec)
Downloading: http://conjars.org/repo/org/apache/hbase/hbase-protocol/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://repository.apache.org/content/repositories/releases/org/apache/hbase/hbase-protocol/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://oss.sonatype.org/content/repositories/snapshots/org/apache/hbase/hbase-protocol/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-protocol/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://repository.apache.org/snapshots/org/apache/hbase/hbase-protocol/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://people.apache.org/~garyh/mvn/org/apache/hbase/hbase-protocol/1.0.1-SNAPSHOT/maven-metadata.xml
Downloaded: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-protocol/1.0.1-SNAPSHOT/maven-metadata.xml (3 KB at 11.4 KB/sec)
Downloaded: http://repository.apache.org/snapshots/org/apache/hbase/hbase-protocol/1.0.1-SNAPSHOT/maven-metadata.xml (3 KB at 12.4 KB/sec)
Downloading: http://conjars.org/repo/org/apache/hbase/hbase-client/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://repository.apache.org/content/repositories/releases/org/apache/hbase/hbase-client/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://oss.sonatype.org/content/repositories/snapshots/org/apache/hbase/hbase-client/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-client/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://repository.apache.org/snapshots/org/apache/hbase/hbase-client/1.0.1-SNAPSHOT/maven-metadata.xml
Downloaded: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-client/1.0.1-SNAPSHOT/maven-metadata.xml (3 KB at 26.6 KB/sec)
Downloading: http://people.apache.org/~garyh/mvn/org/apache/hbase/hbase-client/1.0.1-SNAPSHOT/maven-metadata.xml
Downloaded: http://repository.apache.org/snapshots/org/apache/hbase/hbase-client/1.0.1-SNAPSHOT/maven-metadata.xml (3 KB at 24.4 KB/sec)
Downloading: https://repository.apache.org/content/repositories/releases/org/apache/hbase/hbase-server/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://oss.sonatype.org/content/repositories/snapshots/org/apache/hbase/hbase-server/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://conjars.org/repo/org/apache/hbase/hbase-server/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-server/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://repository.apache.org/snapshots/org/apache/hbase/hbase-server/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://people.apache.org/~garyh/mvn/org/apache/hbase/hbase-server/1.0.1-SNAPSHOT/maven-metadata.xml
Downloaded: http://repository.apache.org/snapshots/org/apache/hbase/hbase-server/1.0.1-SNAPSHOT/maven-metadata.xml (3 KB at 5.0 KB/sec)
Downloaded: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-server/1.0.1-SNAPSHOT/maven-metadata.xml (3 KB at 3.6 KB/sec)
Downloading: https://repository.apache.org/content/repositories/releases/org/apache/hbase/hbase-prefix-tree/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-prefix-tree/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://oss.sonatype.org/content/repositories/snapshots/org/apache/hbase/hbase-prefix-tree/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://conjars.org/repo/org/apache/hbase/hbase-prefix-tree/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://repository.apache.org/snapshots/org/apache/hbase/hbase-prefix-tree/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://people.apache.org/~garyh/mvn/org/apache/hbase/hbase-prefix-tree/1.0.1-SNAPSHOT/maven-metadata.xml
Downloaded: http://repository.apache.org/snapshots/org/apache/hbase/hbase-prefix-tree/1.0.1-SNAPSHOT/maven-metadata.xml (3 KB at 5.6 KB/sec)
Downloaded: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-prefix-tree/1.0.1-SNAPSHOT/maven-metadata.xml (3 KB at 5.1 KB/sec)
Downloading: https://repository.apache.org/content/repositories/releases/org/apache/hbase/hbase-hadoop-compat/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://conjars.org/repo/org/apache/hbase/hbase-hadoop-compat/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-hadoop-compat/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://oss.sonatype.org/content/repositories/snapshots/org/apache/hbase/hbase-hadoop-compat/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://repository.apache.org/snapshots/org/apache/hbase/hbase-hadoop-compat/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://people.apache.org/~garyh/mvn/org/apache/hbase/hbase-hadoop-compat/1.0.1-SNAPSHOT/maven-metadata.xml
Downloaded: http://repository.apache.org/snapshots/org/apache/hbase/hbase-hadoop-compat/1.0.1-SNAPSHOT/maven-metadata.xml (3 KB at 4.8 KB/sec)
Downloaded: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-hadoop-compat/1.0.1-SNAPSHOT/maven-metadata.xml (3 KB at 3.4 KB/sec)
Downloading: https://repository.apache.org/content/repositories/releases/org/apache/hbase/hbase-hadoop2-compat/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://conjars.org/repo/org/apache/hbase/hbase-hadoop2-compat/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-hadoop2-compat/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://oss.sonatype.org/content/repositories/snapshots/org/apache/hbase/hbase-hadoop2-compat/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://repository.apache.org/snapshots/org/apache/hbase/hbase-hadoop2-compat/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://people.apache.org/~garyh/mvn/org/apache/hbase/hbase-hadoop2-compat/1.0.1-SNAPSHOT/maven-metadata.xml
Downloaded: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-hadoop2-compat/1.0.1-SNAPSHOT/maven-metadata.xml (3 KB at 3.9 KB/sec)
Downloaded: http://repository.apache.org/snapshots/org/apache/hbase/hbase-hadoop2-compat/1.0.1-SNAPSHOT/maven-metadata.xml (3 KB at 2.6 KB/sec)
Downloading: https://repository.apache.org/content/repositories/releases/org/apache/hbase/hbase-it/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://conjars.org/repo/org/apache/hbase/hbase-it/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-it/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://oss.sonatype.org/content/repositories/snapshots/org/apache/hbase/hbase-it/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://repository.apache.org/snapshots/org/apache/hbase/hbase-it/1.0.1-SNAPSHOT/maven-metadata.xml
Downloaded: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-it/1.0.1-SNAPSHOT/maven-metadata.xml (2 KB at 2.3 KB/sec)
Downloaded: http://repository.apache.org/snapshots/org/apache/hbase/hbase-it/1.0.1-SNAPSHOT/maven-metadata.xml (2 KB at 1.6 KB/sec)
Downloading: http://conjars.org/repo/org/apache/hbase/hbase-shell/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://oss.sonatype.org/content/repositories/snapshots/org/apache/hbase/hbase-shell/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://repository.apache.org/content/repositories/releases/org/apache/hbase/hbase-shell/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-shell/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://repository.apache.org/snapshots/org/apache/hbase/hbase-shell/1.0.1-SNAPSHOT/maven-metadata.xml
Downloading: http://people.apache.org/~garyh/mvn/org/apache/hbase/hbase-shell/1.0.1-SNAPSHOT/maven-metadata.xml
Downloaded: http://repository.apache.org/snapshots/org/apache/hbase/hbase-shell/1.0.1-SNAPSHOT/maven-metadata.xml (2 KB at 3.4 KB/sec)
Downloaded: https://repository.apache.org/content/repositories/snapshots/org/apache/hbase/hbase-shell/1.0.1-SNAPSHOT/maven-metadata.xml (2 KB at 3.3 KB/sec)
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ phoenix-core ---
[INFO] Deleting <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/target>
[INFO]
[INFO] --- build-helper-maven-plugin:1.9.1:add-test-source (add-test-source) @ phoenix-core ---
[INFO] Test Source directory: <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/it/java> added.
[INFO]
[INFO] --- build-helper-maven-plugin:1.9.1:add-test-resource (add-test-resource) @ phoenix-core ---
[INFO]
[INFO] --- build-helper-maven-plugin:1.9.1:add-source (add-source) @ phoenix-core ---
[INFO] Source directory: <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/target/generated-sources/antlr3> added.
[INFO] Source directory: <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/antlr3> added.
[INFO]
[INFO] --- antlr3-maven-plugin:3.5:antlr (default) @ phoenix-core ---
[INFO] ANTLR: Processing source directory <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/antlr3>
ANTLR Parser Generator Version 3.5
Output file <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/target/generated-sources/antlr3/org/apache/phoenix/parse/PhoenixSQLParser.java> does not exist: must build <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/antlr3/PhoenixSQL.g>
PhoenixSQL.g
[INFO]
[INFO] --- maven-dependency-plugin:2.1:build-classpath (create-phoenix-generated-classpath) @ phoenix-core ---
[INFO] Wrote classpath file '<https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/target/cached_classpath.txt>'.
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ phoenix-core ---
Downloading: http://people.apache.org/~garyh/mvn/org/apache/hbase/hbase/1.0.1-SNAPSHOT/maven-metadata.xml
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ phoenix-core ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource to META-INF/services
[INFO] Copying 3 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.0:compile (default-compile) @ phoenix-core ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 789 source files to <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/target/classes>
[INFO] -------------------------------------------------------------
[WARNING] COMPILATION WARNING :
[INFO] -------------------------------------------------------------
[WARNING] <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/java/org/apache/phoenix/hbase/index/write/ParallelWriterIndexCommitter.java>: Some input files use or override a deprecated API.
[WARNING] <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/java/org/apache/phoenix/hbase/index/write/ParallelWriterIndexCommitter.java>: Recompile with -Xlint:deprecation for details.
[WARNING] <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/java/org/apache/phoenix/parse/FunctionParseNode.java>: Some input files use unchecked or unsafe operations.
[WARNING] <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/java/org/apache/phoenix/parse/FunctionParseNode.java>: Recompile with -Xlint:unchecked for details.
[INFO] 4 warnings
[INFO] -------------------------------------------------------------
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/java/org/apache/phoenix/trace/util/Tracing.java>:[119,49] SAMPLER_FRACTION_CONF_KEY has private access in org.apache.htrace.impl.ProbabilitySampler
[ERROR] <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/java/org/apache/phoenix/trace/util/Tracing.java>:[143,39] SAMPLER_FRACTION_CONF_KEY has private access in org.apache.htrace.impl.ProbabilitySampler
[INFO] 2 errors
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Apache Phoenix .................................... SUCCESS [2.064s]
[INFO] Phoenix Core ...................................... FAILURE [30.347s]
[INFO] Phoenix - Flume ................................... SKIPPED
[INFO] Phoenix - Pig ..................................... SKIPPED
[INFO] Phoenix Assembly .................................. SKIPPED
[INFO] Phoenix - Pherf ................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 32.845s
[INFO] Finished at: Sat Mar 28 00:14:08 UTC 2015
[INFO] Final Memory: 45M/665M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.0:compile (default-compile) on project phoenix-core: Compilation failure: Compilation failure:
[ERROR] <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/java/org/apache/phoenix/trace/util/Tracing.java>:[119,49] SAMPLER_FRACTION_CONF_KEY has private access in org.apache.htrace.impl.ProbabilitySampler
[ERROR] <https://builds.apache.org/job/Phoenix-master/ws/phoenix-core/src/main/java/org/apache/phoenix/trace/util/Tracing.java>:[143,39] SAMPLER_FRACTION_CONF_KEY has private access in org.apache.htrace.impl.ProbabilitySampler
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR] mvn <goals> -rf :phoenix-core
Build step 'Invoke top-level Maven targets' marked build as failure
Archiving artifacts
Sending artifact delta relative to Phoenix | Master #628
Archived 73 artifacts
Archive block size is 32768
Received 1120 blocks and 101388201 bytes
Compression is 26.6%
Took 1 min 0 sec
Updating PHOENIX-1783
Recording test results
Build failed in Jenkins: Phoenix | Master #644
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Phoenix-master/644/changes>
Changes:
[samarth.jain] PHOENIX-1722 Speedup CONVERT_TZ function (Vaclav Loffelmann)
------------------------------------------
[...truncated 821 lines...]
at org.apache.phoenix.end2end.ViewIT.testNonSaltedUpdatableViewWithLocalIndex(ViewIT.java:113)
testNonSaltedUpdatableViewWithIndex(org.apache.phoenix.end2end.ViewIT) Time elapsed: 1.602 sec <<< FAILURE!
java.lang.AssertionError: expected:<6> but was:<1>
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.failNotEquals(Assert.java:834)
at org.junit.Assert.assertEquals(Assert.java:645)
at org.junit.Assert.assertEquals(Assert.java:631)
at org.apache.phoenix.end2end.BaseViewIT.testUpdatableViewIndex(BaseViewIT.java:121)
at org.apache.phoenix.end2end.BaseViewIT.testUpdatableViewWithIndex(BaseViewIT.java:54)
at org.apache.phoenix.end2end.ViewIT.testNonSaltedUpdatableViewWithIndex(ViewIT.java:108)
testReadOnlyOnReadOnlyView(org.apache.phoenix.end2end.ViewIT) Time elapsed: 0.698 sec <<< FAILURE!
java.lang.AssertionError: expected:<4> but was:<1>
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.failNotEquals(Assert.java:834)
at org.junit.Assert.assertEquals(Assert.java:645)
at org.junit.Assert.assertEquals(Assert.java:631)
at org.apache.phoenix.end2end.ViewIT.testReadOnlyView(ViewIT.java:65)
at org.apache.phoenix.end2end.ViewIT.testReadOnlyOnReadOnlyView(ViewIT.java:86)
testReadOnlyView(org.apache.phoenix.end2end.ViewIT) Time elapsed: 0.701 sec <<< FAILURE!
java.lang.AssertionError: expected:<4> but was:<1>
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.failNotEquals(Assert.java:834)
at org.junit.Assert.assertEquals(Assert.java:645)
at org.junit.Assert.assertEquals(Assert.java:631)
at org.apache.phoenix.end2end.ViewIT.testReadOnlyView(ViewIT.java:65)
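The AssertionError messages in the traces above follow JUnit 4's assertEquals formatting. A plain-Java re-creation of how that "expected:<…> but was:<…>" text is produced (illustrative code only, not Phoenix or JUnit source):

```java
public class Main {
    // Simplified analogue of org.junit.Assert.assertEquals for Objects.
    static void assertEquals(Object expected, Object actual) {
        boolean equal = (expected == null) ? actual == null : expected.equals(actual);
        if (equal) return;
        // JUnit 4's Assert builds a message of exactly this shape on failure.
        throw new AssertionError("expected:<" + expected + "> but was:<" + actual + ">");
    }

    public static void main(String[] args) {
        try {
            assertEquals(6, 1); // mirrors the failing row-count check in ViewIT
        } catch (AssertionError e) {
            System.out.println(e.getMessage()); // prints: expected:<6> but was:<1>
        }
    }
}
```

So "expected:<6> but was:<1>" means the test's computed count was 1 where the assertion demanded 6 — a data/result mismatch, not a crash.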
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 49.852 sec - in org.apache.phoenix.hbase.index.covered.example.EndToEndCoveredIndexingIT
Running org.apache.phoenix.end2end.index.MutableIndexFailureIT
Tests run: 16, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 48.592 sec <<< FAILURE! - in org.apache.phoenix.end2end.TenantSpecificTablesDMLIT
testBasicUpsertSelect2(org.apache.phoenix.end2end.TenantSpecificTablesDMLIT) Time elapsed: 3.923 sec <<< FAILURE!
java.lang.AssertionError: expected:<3> but was:<1>
at org.junit.Assert.fail(Assert.java:88)
at org.junit.Assert.failNotEquals(Assert.java:834)
at org.junit.Assert.assertEquals(Assert.java:645)
at org.junit.Assert.assertEquals(Assert.java:631)
at org.apache.phoenix.end2end.TenantSpecificTablesDMLIT.testBasicUpsertSelect2(TenantSpecificTablesDMLIT.java:158)
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 49.453 sec - in org.apache.phoenix.hbase.index.covered.example.EndtoEndIndexingWithCompressionIT
Running org.apache.phoenix.end2end.TenantSpecificTablesDDLIT
Running org.apache.phoenix.end2end.index.MutableIndexReplicationIT
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 94.306 sec - in org.apache.phoenix.hbase.index.balancer.IndexLoadBalancerIT
Running org.apache.phoenix.end2end.index.DropIndexDuringUpsertIT
Running org.apache.phoenix.end2end.index.ImmutableIndexWithStatsIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.67 sec - in org.apache.phoenix.end2end.index.ImmutableIndexWithStatsIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 17.265 sec - in org.apache.phoenix.end2end.index.MutableIndexReplicationIT
Running org.apache.phoenix.end2end.KeyOnlyIT
Running org.apache.phoenix.end2end.StatsCollectorIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.691 sec - in org.apache.phoenix.end2end.KeyOnlyIT
Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 52.378 sec - in org.apache.phoenix.end2end.TenantSpecificTablesDDLIT
Running org.apache.phoenix.end2end.QueryTimeoutIT
Running org.apache.phoenix.end2end.ContextClassloaderIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.478 sec - in org.apache.phoenix.end2end.ContextClassloaderIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.6 sec - in org.apache.phoenix.end2end.QueryTimeoutIT
Running org.apache.phoenix.end2end.MultiCfQueryExecIT
Running org.apache.phoenix.end2end.QueryWithLimitIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 52.114 sec - in org.apache.phoenix.end2end.StatsCollectorIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.287 sec - in org.apache.phoenix.end2end.QueryWithLimitIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.824 sec - in org.apache.phoenix.end2end.MultiCfQueryExecIT
Running org.apache.phoenix.end2end.ParallelIteratorsIT
Running org.apache.phoenix.end2end.SaltedViewIT
Running org.apache.phoenix.end2end.SpillableGroupByIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.149 sec - in org.apache.phoenix.end2end.ParallelIteratorsIT
Tests run: 2, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 12.295 sec <<< FAILURE! - in org.apache.phoenix.end2end.SaltedViewIT
testSaltedUpdatableViewWithLocalIndex(org.apache.phoenix.end2end.SaltedViewIT) Time elapsed: 3.175 sec <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.TableNotFoundException: Table '_LOCAL_IDX__LOCAL_IDX_T' was not found, got: _LOCAL_IDX_T.
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1225)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1109)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1093)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1050)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getRegionLocation(ConnectionManager.java:885)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.getRegionLocation(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.RegionServerCallable.prepare(RegionServerCallable.java:78)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
at org.apache.hadoop.hbase.client.HTable.get(HTable.java:881)
at org.apache.hadoop.hbase.client.HTableWrapper.get(HTableWrapper.java:125)
at org.apache.phoenix.util.IndexUtil.wrapResultUsingOffset(IndexUtil.java:492)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$2.nextRaw(BaseScannerRegionObserver.java:310)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:76)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2101)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31305)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2031)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:724)
at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:262)
at java.util.concurrent.FutureTask.get(FutureTask.java:119)
at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:536)
at org.apache.phoenix.iterate.MergeSortResultIterator.getIterators(MergeSortResultIterator.java:48)
at org.apache.phoenix.iterate.MergeSortResultIterator.minIterator(MergeSortResultIterator.java:84)
at org.apache.phoenix.iterate.MergeSortResultIterator.next(MergeSortResultIterator.java:111)
at org.apache.phoenix.jdbc.PhoenixResultSet.next(PhoenixResultSet.java:756)
at org.apache.phoenix.end2end.BaseViewIT.testUpdatableViewIndex(BaseViewIT.java:125)
at org.apache.phoenix.end2end.BaseViewIT.testUpdatableViewWithIndex(BaseViewIT.java:54)
at org.apache.phoenix.end2end.SaltedViewIT.testSaltedUpdatableViewWithLocalIndex(SaltedViewIT.java:39)
Caused by: org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.TableNotFoundException: Table '_LOCAL_IDX__LOCAL_IDX_T' was not found, got: _LOCAL_IDX_T.
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1225)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1109)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1093)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1050)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getRegionLocation(ConnectionManager.java:885)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.getRegionLocation(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.RegionServerCallable.prepare(RegionServerCallable.java:78)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
at org.apache.hadoop.hbase.client.HTable.get(HTable.java:881)
at org.apache.hadoop.hbase.client.HTableWrapper.get(HTableWrapper.java:125)
at org.apache.phoenix.util.IndexUtil.wrapResultUsingOffset(IndexUtil.java:492)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$2.nextRaw(BaseScannerRegionObserver.java:310)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:76)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2101)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31305)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2031)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:724)
at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:108)
at org.apache.phoenix.iterate.ScanningResultIterator.next(ScanningResultIterator.java:56)
at org.apache.phoenix.iterate.TableResultIterator.next(TableResultIterator.java:104)
at org.apache.phoenix.iterate.ChunkedResultIterator$SingleChunkResultIterator.next(ChunkedResultIterator.java:149)
at org.apache.phoenix.iterate.SpoolingResultIterator.<init>(SpoolingResultIterator.java:107)
at org.apache.phoenix.iterate.SpoolingResultIterator.<init>(SpoolingResultIterator.java:74)
at org.apache.phoenix.iterate.SpoolingResultIterator$SpoolingResultIteratorFactory.newIterator(SpoolingResultIterator.java:68)
at org.apache.phoenix.iterate.ChunkedResultIterator.<init>(ChunkedResultIterator.java:92)
at org.apache.phoenix.iterate.ChunkedResultIterator$ChunkedResultIteratorFactory.newIterator(ChunkedResultIterator.java:72)
at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:93)
at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:84)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at org.apache.phoenix.job.JobManager$InstrumentedJobFutureTask.run(JobManager.java:172)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:724)
Caused by: org.apache.hadoop.hbase.TableNotFoundException: org.apache.hadoop.hbase.TableNotFoundException: Table '_LOCAL_IDX__LOCAL_IDX_T' was not found, got: _LOCAL_IDX_T.
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1225)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1109)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1093)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1050)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getRegionLocation(ConnectionManager.java:885)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.getRegionLocation(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.RegionServerCallable.prepare(RegionServerCallable.java:78)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
at org.apache.hadoop.hbase.client.HTable.get(HTable.java:881)
at org.apache.hadoop.hbase.client.HTableWrapper.get(HTableWrapper.java:125)
at org.apache.phoenix.util.IndexUtil.wrapResultUsingOffset(IndexUtil.java:492)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$2.nextRaw(BaseScannerRegionObserver.java:310)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:76)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2101)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31305)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2031)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:724)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:313)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:229)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:62)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:294)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:275)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:724)
Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: org.apache.hadoop.hbase.TableNotFoundException: Table '_LOCAL_IDX__LOCAL_IDX_T' was not found, got: _LOCAL_IDX_T.
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1225)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1109)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1093)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1050)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getRegionLocation(ConnectionManager.java:885)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.getRegionLocation(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.RegionServerCallable.prepare(RegionServerCallable.java:78)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
at org.apache.hadoop.hbase.client.HTable.get(HTable.java:881)
at org.apache.hadoop.hbase.client.HTableWrapper.get(HTableWrapper.java:125)
at org.apache.phoenix.util.IndexUtil.wrapResultUsingOffset(IndexUtil.java:492)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$2.nextRaw(BaseScannerRegionObserver.java:310)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:76)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2101)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31305)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2031)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:724)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1199)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:216)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:300)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:31751)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:199)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:62)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:294)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:275)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:724)
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.573 sec - in org.apache.phoenix.end2end.SpillableGroupByIT
Running org.apache.phoenix.end2end.AlterTableIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 142.06 sec - in org.apache.phoenix.end2end.index.MutableIndexFailureIT
Running org.apache.phoenix.end2end.StatsCollectorWithSplitsAndMultiCFIT
Running org.apache.phoenix.end2end.CountDistinctCompressionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.954 sec - in org.apache.phoenix.end2end.CountDistinctCompressionIT
Running org.apache.phoenix.mapreduce.IndexToolIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 30.392 sec - in org.apache.phoenix.end2end.StatsCollectorWithSplitsAndMultiCFIT
Running org.apache.phoenix.mapreduce.CsvBulkLoadToolIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 213.538 sec - in org.apache.phoenix.mapreduce.IndexToolIT
Tests run: 51, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 282.946 sec - in org.apache.phoenix.end2end.AlterTableIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 255.929 sec - in org.apache.phoenix.mapreduce.CsvBulkLoadToolIT
Build timed out (after 120 minutes). Marking the build as failed.
Build was aborted
Archiving artifacts
Sending artifact delta relative to Phoenix | Master #628
Archived 1003 artifacts
Archive block size is 32768
Received 6594 blocks and 758839963 bytes
Compression is 22.2%
Took 4 min 39 sec
Updating PHOENIX-1722
Recording test results
Build failed in Jenkins: Phoenix | Master #643
Posted by Apache Jenkins Server <je...@builds.apache.org>.
See <https://builds.apache.org/job/Phoenix-master/643/changes>
Changes:
[thomas] PHOENIX-1457 Use high priority queue for metadata endpoint calls
------------------------------------------
[...truncated 4809 lines...]
at java.sql.DriverManager.getConnection(DriverManager.java:596)
at java.sql.DriverManager.getConnection(DriverManager.java:187)
at org.apache.phoenix.query.BaseTest.createTestTable(BaseTest.java:743)
at org.apache.phoenix.query.BaseTest.createTestTable(BaseTest.java:725)
at org.apache.phoenix.query.BaseTest.ensureTableCreated(BaseTest.java:717)
at org.apache.phoenix.end2end.KeyOnlyIT.testQueryWithLimitAndStats(KeyOnlyIT.java:167)
testQueryWithLimitAndStats(org.apache.phoenix.end2end.KeyOnlyIT) Time elapsed: 0.001 sec <<< ERROR!
java.sql.SQLException: No suitable driver found for jdbc:phoenix:localhost:64182;test=true
at java.sql.DriverManager.getConnection(DriverManager.java:596)
at java.sql.DriverManager.getConnection(DriverManager.java:187)
at org.apache.phoenix.query.BaseTest.deletePriorTables(BaseTest.java:779)
at org.apache.phoenix.query.BaseTest.deletePriorTables(BaseTest.java:770)
at org.apache.phoenix.end2end.BaseOwnClusterClientManagedTimeIT.cleanUpAfterTest(BaseOwnClusterClientManagedTimeIT.java:29)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:33)
at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.internal.runners.statements.RunBefores.evaluate(RunBefores.java:26)
at org.junit.internal.runners.statements.RunAfters.evaluate(RunAfters.java:27)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.junit.runners.Suite.runChild(Suite.java:128)
at org.junit.runners.Suite.runChild(Suite.java:27)
at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
at org.junit.runner.JUnitCore.run(JUnitCore.java:137)
at org.junit.runner.JUnitCore.run(JUnitCore.java:115)
at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.createRequestAndRun(JUnitCoreWrapper.java:107)
at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.executeLazy(JUnitCoreWrapper.java:88)
at org.apache.maven.surefire.junitcore.JUnitCoreWrapper.execute(JUnitCoreWrapper.java:57)
at org.apache.maven.surefire.junitcore.JUnitCoreProvider.invoke(JUnitCoreProvider.java:144)
at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:203)
at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:155)
at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:103)
Running org.apache.phoenix.end2end.StatsCollectorIT
Running org.apache.phoenix.end2end.ContextClassloaderIT
Running org.apache.phoenix.end2end.QueryTimeoutIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 8.805 sec - in org.apache.phoenix.end2end.ContextClassloaderIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.711 sec - in org.apache.phoenix.end2end.QueryTimeoutIT
Running org.apache.phoenix.end2end.MultiCfQueryExecIT
Running org.apache.phoenix.end2end.QueryWithLimitIT
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 22.184 sec - in org.apache.phoenix.end2end.MultiCfQueryExecIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 13.413 sec - in org.apache.phoenix.end2end.QueryWithLimitIT
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 50.684 sec - in org.apache.phoenix.end2end.StatsCollectorIT
Running org.apache.phoenix.end2end.SaltedViewIT
Running org.apache.phoenix.end2end.ParallelIteratorsIT
Running org.apache.phoenix.end2end.SpillableGroupByIT
Tests run: 2, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 12.599 sec <<< FAILURE! - in org.apache.phoenix.end2end.SaltedViewIT
testSaltedUpdatableViewWithLocalIndex(org.apache.phoenix.end2end.SaltedViewIT) Time elapsed: 2.867 sec <<< ERROR!
org.apache.phoenix.exception.PhoenixIOException: org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.TableNotFoundException: Table '_LOCAL_IDX__LOCAL_IDX_T' was not found, got: _LOCAL_IDX_T.
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1225)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1109)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1093)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1050)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getRegionLocation(ConnectionManager.java:885)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.getRegionLocation(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.RegionServerCallable.prepare(RegionServerCallable.java:78)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
at org.apache.hadoop.hbase.client.HTable.get(HTable.java:881)
at org.apache.hadoop.hbase.client.HTableWrapper.get(HTableWrapper.java:125)
at org.apache.phoenix.util.IndexUtil.wrapResultUsingOffset(IndexUtil.java:492)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$2.nextRaw(BaseScannerRegionObserver.java:310)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:76)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2101)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31305)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2031)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:724)
at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:262)
at java.util.concurrent.FutureTask.get(FutureTask.java:119)
at org.apache.phoenix.iterate.BaseResultIterators.getIterators(BaseResultIterators.java:536)
at org.apache.phoenix.iterate.MergeSortResultIterator.getIterators(MergeSortResultIterator.java:48)
at org.apache.phoenix.iterate.MergeSortResultIterator.minIterator(MergeSortResultIterator.java:84)
at org.apache.phoenix.iterate.MergeSortResultIterator.next(MergeSortResultIterator.java:111)
at org.apache.phoenix.jdbc.PhoenixResultSet.next(PhoenixResultSet.java:756)
at org.apache.phoenix.end2end.BaseViewIT.testUpdatableViewIndex(BaseViewIT.java:125)
at org.apache.phoenix.end2end.BaseViewIT.testUpdatableViewWithIndex(BaseViewIT.java:54)
at org.apache.phoenix.end2end.SaltedViewIT.testSaltedUpdatableViewWithLocalIndex(SaltedViewIT.java:39)
Caused by: org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.TableNotFoundException: Table '_LOCAL_IDX__LOCAL_IDX_T' was not found, got: _LOCAL_IDX_T.
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1225)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1109)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1093)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1050)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getRegionLocation(ConnectionManager.java:885)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.getRegionLocation(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.RegionServerCallable.prepare(RegionServerCallable.java:78)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
at org.apache.hadoop.hbase.client.HTable.get(HTable.java:881)
at org.apache.hadoop.hbase.client.HTableWrapper.get(HTableWrapper.java:125)
at org.apache.phoenix.util.IndexUtil.wrapResultUsingOffset(IndexUtil.java:492)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$2.nextRaw(BaseScannerRegionObserver.java:310)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:76)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2101)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31305)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2031)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:724)
at org.apache.phoenix.util.ServerUtil.parseServerException(ServerUtil.java:108)
at org.apache.phoenix.iterate.ScanningResultIterator.next(ScanningResultIterator.java:56)
at org.apache.phoenix.iterate.TableResultIterator.next(TableResultIterator.java:104)
at org.apache.phoenix.iterate.ChunkedResultIterator$SingleChunkResultIterator.next(ChunkedResultIterator.java:149)
at org.apache.phoenix.iterate.SpoolingResultIterator.<init>(SpoolingResultIterator.java:107)
at org.apache.phoenix.iterate.SpoolingResultIterator.<init>(SpoolingResultIterator.java:74)
at org.apache.phoenix.iterate.SpoolingResultIterator$SpoolingResultIteratorFactory.newIterator(SpoolingResultIterator.java:68)
at org.apache.phoenix.iterate.ChunkedResultIterator.<init>(ChunkedResultIterator.java:92)
at org.apache.phoenix.iterate.ChunkedResultIterator$ChunkedResultIteratorFactory.newIterator(ChunkedResultIterator.java:72)
at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:93)
at org.apache.phoenix.iterate.ParallelIterators$1.call(ParallelIterators.java:84)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at org.apache.phoenix.job.JobManager$InstrumentedJobFutureTask.run(JobManager.java:172)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:724)
Caused by: org.apache.hadoop.hbase.TableNotFoundException: org.apache.hadoop.hbase.TableNotFoundException: Table '_LOCAL_IDX__LOCAL_IDX_T' was not found, got: _LOCAL_IDX_T.
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1225)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1109)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1093)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1050)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getRegionLocation(ConnectionManager.java:885)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.getRegionLocation(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.RegionServerCallable.prepare(RegionServerCallable.java:78)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
at org.apache.hadoop.hbase.client.HTable.get(HTable.java:881)
at org.apache.hadoop.hbase.client.HTableWrapper.get(HTableWrapper.java:125)
at org.apache.phoenix.util.IndexUtil.wrapResultUsingOffset(IndexUtil.java:492)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$2.nextRaw(BaseScannerRegionObserver.java:310)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:76)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2101)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31305)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2031)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:724)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
at org.apache.hadoop.hbase.protobuf.ProtobufUtil.getRemoteException(ProtobufUtil.java:313)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:229)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:62)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:294)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:275)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:724)
Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException: org.apache.hadoop.hbase.TableNotFoundException: Table '_LOCAL_IDX__LOCAL_IDX_T' was not found, got: _LOCAL_IDX_T.
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegionInMeta(ConnectionManager.java:1225)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1109)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1093)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.locateRegion(ConnectionManager.java:1050)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.locateRegion(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getRegionLocation(ConnectionManager.java:885)
at org.apache.hadoop.hbase.client.CoprocessorHConnection.getRegionLocation(CoprocessorHConnection.java:41)
at org.apache.hadoop.hbase.client.RegionServerCallable.prepare(RegionServerCallable.java:78)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
at org.apache.hadoop.hbase.client.HTable.get(HTable.java:881)
at org.apache.hadoop.hbase.client.HTableWrapper.get(HTableWrapper.java:125)
at org.apache.phoenix.util.IndexUtil.wrapResultUsingOffset(IndexUtil.java:492)
at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$2.nextRaw(BaseScannerRegionObserver.java:310)
at org.apache.phoenix.coprocessor.DelegateRegionScanner.nextRaw(DelegateRegionScanner.java:76)
at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2101)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:31305)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2031)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:107)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:724)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1199)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:216)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:300)
at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$BlockingStub.scan(ClientProtos.java:31751)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:199)
at org.apache.hadoop.hbase.client.ScannerCallable.call(ScannerCallable.java:62)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:294)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas$RetryingRPC.call(ScannerCallableWithReplicas.java:275)
at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:334)
at java.util.concurrent.FutureTask.run(FutureTask.java:166)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:724)
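The root cause above is a TableNotFoundException for '_LOCAL_IDX__LOCAL_IDX_T', i.e. the local-index prefix applied twice to table T. A minimal sketch of how such a name could arise, assuming a naive prefix-concatenation step is re-applied to an already-prefixed name (the class `PrefixDemo`, method `toLocalIndexName`, and constant `LOCAL_INDEX_PREFIX` here are hypothetical illustrations, not Phoenix's actual API):

```java
public class PrefixDemo {
    // Hypothetical: local-index physical table names are formed by prepending
    // a fixed prefix to the data table name.
    static final String LOCAL_INDEX_PREFIX = "_LOCAL_IDX_";

    static String toLocalIndexName(String tableName) {
        return LOCAL_INDEX_PREFIX + tableName;
    }

    public static void main(String[] args) {
        // Prefixing once yields the table that actually exists.
        String once = toLocalIndexName("T");
        // Prefixing an already-prefixed name reproduces the doubled name
        // reported in the TableNotFoundException above.
        String twice = toLocalIndexName(once);
        System.out.println(once);   // _LOCAL_IDX_T
        System.out.println(twice);  // _LOCAL_IDX__LOCAL_IDX_T
    }
}
```

If the failing lookup path ever receives a name that has already been converted, a guard (e.g. checking for the prefix before prepending) would avoid the doubled name; whether that is the actual bug here is an assumption based only on the error text.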
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 12.665 sec - in org.apache.phoenix.end2end.ParallelIteratorsIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 11.462 sec - in org.apache.phoenix.end2end.SpillableGroupByIT
Running org.apache.phoenix.end2end.StatsCollectorWithSplitsAndMultiCFIT
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 142.123 sec - in org.apache.phoenix.end2end.index.MutableIndexFailureIT
Running org.apache.phoenix.end2end.AlterTableIT
Running org.apache.phoenix.end2end.CountDistinctCompressionIT
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.569 sec - in org.apache.phoenix.end2end.CountDistinctCompressionIT
Running org.apache.phoenix.mapreduce.IndexToolIT
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 32.181 sec - in org.apache.phoenix.end2end.StatsCollectorWithSplitsAndMultiCFIT
Running org.apache.phoenix.mapreduce.CsvBulkLoadToolIT
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 223.315 sec - in org.apache.phoenix.mapreduce.IndexToolIT
Tests run: 51, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 282.702 sec - in org.apache.phoenix.end2end.AlterTableIT
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 256.256 sec - in org.apache.phoenix.mapreduce.CsvBulkLoadToolIT
Build timed out (after 120 minutes). Marking the build as failed.
Build was aborted
Archiving artifacts
Sending artifact delta relative to Phoenix | Master #628
Archived 966 artifacts
Archive block size is 32768
Received 5859 blocks and 726255685 bytes
Compression is 20.9%
Took 4 min 31 sec
Updating PHOENIX-1457
Recording test results