Posted to user@phoenix.apache.org by Nathan Davis <na...@salesforce.com> on 2016/08/09 18:20:52 UTC

TableNotFoundException, tableName=SYSTEM.CATALOG with phoenix-spark

I am trying to create a simple POC of the Spark / Phoenix integration. The
operation is:

val df = sqlContext.load("org.apache.phoenix.spark", Map("table" ->
  "SIMPLE_TABLE", "zkUrl" -> "some-name:2181"))


The error I get from that is:

org.apache.phoenix.schema.TableNotFoundException: ERROR 1012 (42M03): Table undefined. tableName=SYSTEM.CATALOG
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.getAllTableRegions(ConnectionQueryServicesImpl.java:542)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.checkClientServerCompatibility(ConnectionQueryServicesImpl.java:1113)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1033)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1369)
    at org.apache.phoenix.query.DelegateConnectionQueryServices.createTable(DelegateConnectionQueryServices.java:120)
    at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2116)
    at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:828)
    at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:183)
    at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:338)
    at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:326)
    at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:324)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1326)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2279)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$13.call(ConnectionQueryServicesImpl.java:2248)
    at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:78)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2248)
    at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:233)
    at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:135)
    at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:202)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:208)
    at org.apache.phoenix.mapreduce.util.ConnectionUtil.getConnection(ConnectionUtil.java:98)


This is in a spark-shell session started with the command:

spark-shell --packages com.databricks:spark-csv_2.10:1.4.0 \
  --jars /root/jars/phoenix-spark-4.7.0-HBase-1.2.jar,/root/jars/phoenix-4.7.0-HBase-1.2-client.jar



Using both sqlline.py and the hbase shell, I can see that SYSTEM.CATALOG clearly
exists and has the table metadata I'd expect.
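
(The checks were nothing fancy; roughly the following, output omitted:)

    -- in sqlline.py
    !tables
    SELECT TABLE_NAME FROM SYSTEM.CATALOG LIMIT 5;

    # in the hbase shell
    list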

What am I doing wrong here?

Thanks,
-nathan

Re: TableNotFoundException, tableName=SYSTEM.CATALOG with phoenix-spark

Posted by Nathan Davis <na...@salesforce.com>.
When you choose to include Phoenix in the EMR deployment, it puts Amazon's
build of all the jars and bin scripts under /usr/lib/phoenix/.

sqlline, psql, etc. all work fine on the EMR master server. But the Spark
cluster I'm trying to connect to HBase from is not part of that EMR
deployment; it is separate.

I think what I'm going to try (since we are just working on a POC) is to
fire up another EMR deployment that includes both HBase/Phoenix and Spark.
That may make the integration easier. Obviously it doesn't solve the
underlying problem, but it gets me to a working POC faster.



Re: TableNotFoundException, tableName=SYSTEM.CATALOG with phoenix-spark

Posted by Josh Mahonin <jm...@gmail.com>.
Hi Nathan,

That could very well be the issue; I suspect they're running a local fork
if it's on top of HBase 1.2.

I'm not familiar with the EMR environment. When you use sqlline.py, is it
using their own Phoenix JARs or yours? If it's theirs, perhaps the
phoenix-client-spark JAR is available in the environment as well.
The 'Phoenix Clients' [1] page suggests that there may be a Phoenix
installation at /home/hadoop/usr/lib/phoenix
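
(If it helps, something along these lines on the EMR master should show whether a suitable client JAR is already shipped; the paths are only guesses based on that page and your earlier note:)

    ls /home/hadoop/usr/lib/phoenix/*.jar /usr/lib/phoenix/*.jar 2>/dev/null | grep -i client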

Good luck,

Josh

[1] http://docs.aws.amazon.com/ElasticMapReduce/latest/ReleaseGuide/emr-phoenix.html#d0e18597


Re: TableNotFoundException, tableName=SYSTEM.CATALOG with phoenix-spark

Posted by Nathan Davis <na...@salesforce.com>.
Thanks Josh, I tried that out (adding just the phoenix-client-spark jar to
the classpath) and got the same error result.

I wonder if the issue is that I'm running on EMR 5 with HBase 1.2. The jars
I'm using are copied over from the HBase master because there is no
4.7.0-HBase-1.2 artifact set in Maven. Is the phoenix-spark functionality
confirmed to work in 4.7 against HBase 1.2?



Re: TableNotFoundException, tableName=SYSTEM.CATALOG with phoenix-spark

Posted by Josh Mahonin <jm...@gmail.com>.
Hi Nathan,

That's a new error to me. I've heard some people have had luck passing
the phoenix-spark and phoenix-client JARs using the --jars option, but the
recommended procedure is to ensure the *phoenix-client-spark* JAR is on the
Spark driver and executor classpath via config. [1]
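
(For reference, the setup described in [1] amounts to something like the following in spark-defaults.conf; substitute the actual path to your phoenix-client-spark JAR:)

    spark.driver.extraClassPath     /path/to/phoenix-client-spark.jar
    spark.executor.extraClassPath   /path/to/phoenix-client-spark.jar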

As a reference, here's a Docker image with a working configuration as well
[2]

Good luck,

Josh

[1] https://phoenix.apache.org/phoenix_spark.html
[2] https://github.com/jmahonin/docker-phoenix/tree/phoenix_spark
