Posted to user@spark.apache.org by Kanagha Kumar <kp...@salesforce.com> on 2017/07/06 23:00:09 UTC

Spark 2.0.2 - JdbcRelationProvider does not allow create table as select

Hi,

I'm running Spark 2.0.2 and I'm hitting an issue with DataFrameWriter.save()

Code:

ds.write().format("jdbc").mode("overwrite")
    .options(ImmutableMap.of(
        "driver", "org.apache.phoenix.jdbc.PhoenixDriver",
        "url", urlWithTenant,
        "dbtable", "tableName"))
    .save();


I found this was reported against earlier Spark versions but appears to have
been fixed in the 2.0.x line. Please let me know whether this issue still
exists in 2.0.2.
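
For completeness, here is a minimal, self-contained sketch of the failing
write. The class name, the input dataset, the tenant-specific Phoenix URL,
and the table name below are placeholders standing in for what my actual job
(TenantPhoenix) builds; only the write call itself matches the snippet above.

import com.google.common.collect.ImmutableMap;

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class JdbcWriteRepro {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("JdbcWriteRepro")
                .getOrCreate();

        // Placeholder dataset; the real job builds `ds` from upstream data.
        Dataset<Row> ds = spark.range(10).toDF("ID");

        // Placeholder tenant-specific Phoenix JDBC URL (urlWithTenant above).
        String urlWithTenant = "jdbc:phoenix:zk-host:2181";

        // Same write call as above; this is the line that fails with
        // "JdbcRelationProvider does not allow create table as select".
        ds.write().format("jdbc").mode("overwrite")
                .options(ImmutableMap.of(
                        "driver", "org.apache.phoenix.jdbc.PhoenixDriver",
                        "url", urlWithTenant,
                        "dbtable", "tableName"))
                .save();

        spark.stop();
    }
}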

17/07/06 15:53:12 ERROR ApplicationMaster: User class threw exception: java.lang.RuntimeException: org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider does not allow create table as select.
java.lang.RuntimeException: org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider does not allow create table as select.
        at scala.sys.package$.error(package.scala:27)
        at org.apache.spark.sql.execution.datasources.DataSource.write(DataSource.scala:530)
        at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:211)
        at TenantPhoenix.main(TenantPhoenix.java:64)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:627)
17/07/06 15:53:12 INFO ApplicationMaster: Final app status: FAILED, exitCode: 15, (reason: User class threw exception: java.lang.RuntimeException: org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider does not allow create table as select.)
17/07/06 15:53:12 INFO SparkContext: Invoking stop() from shutdown hook
17/07/06 15:53:12 INFO SparkUI: Stopped Spark web UI at http://10.3.9.95:50461
17/07/06 15:53:12 INFO YarnAllocator: Driver requested a total number of 0 executor(s).
17/07/06 15:53:12 INFO YarnClusterSchedulerBackend: Shutting down all executors
17/07/06 15:53:12 INFO YarnSchedulerBackend$YarnDriverEndpoint: Asking each executor to shut down
17/07/06 15:53:12 INFO SchedulerExtensionServices: Stopping SchedulerExtensionServices

Thanks

Re: Spark 2.0.2 - JdbcRelationProvider does not allow create table as select

Posted by Kanagha Kumar <kp...@salesforce.com>.
Hi all,

Bumping this again! Please let me know if anyone has run into this on the
2.0.x versions; I am using Spark 2.0.2 at runtime. Based on the comments, I
will open a bug if necessary. Thanks!
