Posted to issues@spark.apache.org by "Xiao Li (JIRA)" <ji...@apache.org> on 2016/07/29 21:27:20 UTC

[jira] [Closed] (SPARK-16789) Can't run saveAsTable with database name

     [ https://issues.apache.org/jira/browse/SPARK-16789?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiao Li closed SPARK-16789.
---------------------------
    Resolution: Duplicate

> Can't run saveAsTable with database name
> ----------------------------------------
>
>                 Key: SPARK-16789
>                 URL: https://issues.apache.org/jira/browse/SPARK-16789
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>         Environment: CentOS 7 JDK 1.8 Hive 1.2.1
>            Reporter: SonixLegend
>
> Calling "saveAsTable" with a database-qualified table name works on 1.6.2, but after upgrading to 2.0 it fails with the error below. My code and the error message follow. Can you help me?
> conf/hive-site.xml
> <property>
>         <name>hive.metastore.uris</name>
>         <value>thrift://localhost:9083</value>
> </property>
> import org.apache.spark.sql.{SaveMode, SparkSession}
> val spark = SparkSession.builder().appName("SparkHive").enableHiveSupport().getOrCreate()
> import spark.implicits._
> import spark.sql
> // Read the existing Hive table and register it as a temp view.
> val source = sql("select * from sample.sample")
> source.createOrReplaceTempView("test")
> source.collect.foreach { tuple => println(tuple(0) + ":" + tuple(1)) }
> val target = sql("select key, 'Spark' as value from test")
> println(target.count())
> // Appending back to the database-qualified table fails on 2.0:
> target.write.mode(SaveMode.Append).saveAsTable("sample.sample")
> spark.stop()
> Exception in thread "main" org.apache.spark.sql.AnalysisException: Saving data in MetastoreRelation sample, sample is not supported.;
> 	at org.apache.spark.sql.execution.command.CreateDataSourceTableAsSelectCommand.run(createDataSourceTables.scala:218)
> 	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:60)
> 	at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:58)
> 	at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:74)
> 	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:115)
> 	at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:115)
> 	at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:136)
> 	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
> 	at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:133)
> 	at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:114)
> 	at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:86)
> 	at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:86)
> 	at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:378)
> 	at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:354)
> 	at com.newtouch.sample.SparkHive$.main(SparkHive.scala:25)
> 	at com.newtouch.sample.SparkHive.main(SparkHive.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
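>
> A possible workaround, sketched here under the assumption that the target table sample.sample already exists in the metastore: DataFrameWriter.insertInto appends rows into an existing Hive table and does not go through CreateDataSourceTableAsSelectCommand, so it may sidestep this AnalysisException on 2.0.
>
> import org.apache.spark.sql.SparkSession
> val spark = SparkSession.builder().appName("SparkHive").enableHiveSupport().getOrCreate()
> val target = spark.sql("select key, 'Spark' as value from sample.sample")
> // insertInto matches columns by position against the existing table schema,
> // so target must have the same number and order of columns as sample.sample.
> target.write.insertInto("sample.sample")
> spark.stop()
>
> insertInto appends by default; chaining .mode("overwrite") before it replaces the table contents instead.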


