Posted to user@phoenix.apache.org by Dequn Zhang <d....@gmail.com> on 2016/11/22 03:18:03 UTC

Spark 2.0.2 thin conn to Phoenix 4.8.1 error

Hello, since Spark 2.x cannot use the Phoenix Spark interpreter to load data,
I want to use JDBC instead. A *direct connection* works fine, but when I try
to open a *thin connection* I get the error below. I ran this in spark-shell
with Scala 2.11.8. Can anyone suggest a solution?

   Phoenix : 4.8.1-HBase-1.2

scala>
> val jdbcDf = spark.read
>   .format("jdbc")
>   .option("driver", "org.apache.phoenix.queryserver.client.Driver")
>   .option("url", "jdbc:phoenix:thin:url=http://192.168.6.131:8765;serialization=PROTOBUF")
>   .option("dbtable", "imos")
>   .load()
>
> java.sql.SQLException: While closing connection
>   at org.apache.calcite.avatica.Helper.createException(Helper.java:39)
>   at org.apache.calcite.avatica.AvaticaConnection.close(AvaticaConnection.java:156)
>   at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:167)
>   at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:117)
>   at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:53)
>   at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:345)
>   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:149)
>   at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:122)
>   ... 53 elided
> Caused by: java.lang.RuntimeException: response code 500
>   at org.apache.calcite.avatica.remote.RemoteService.apply(RemoteService.java:45)
>   at org.apache.calcite.avatica.remote.JsonService.apply(JsonService.java:227)
>   at org.apache.calcite.avatica.remote.RemoteMeta.closeConnection(RemoteMeta.java:78)
>   at org.apache.calcite.avatica.AvaticaConnection.close(AvaticaConnection.java:153)
>   ... 59 more