Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2020/12/24 18:14:00 UTC
[jira] [Assigned] (SPARK-33904) Recognize `spark_catalog` in `saveAsTable()` and `insertInto()`
[ https://issues.apache.org/jira/browse/SPARK-33904?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-33904:
------------------------------------
Assignee: Apache Spark
> Recognize `spark_catalog` in `saveAsTable()` and `insertInto()`
> ---------------------------------------------------------------
>
> Key: SPARK-33904
> URL: https://issues.apache.org/jira/browse/SPARK-33904
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 3.2.0
> Reporter: Maxim Gekk
> Assignee: Apache Spark
> Priority: Major
>
> The v1 INSERT INTO command recognizes `spark_catalog` as the default session catalog:
> {code:sql}
> spark-sql> create table spark_catalog.ns.tbl (c int);
> spark-sql> insert into spark_catalog.ns.tbl select 0;
> spark-sql> select * from spark_catalog.ns.tbl;
> 0
> {code}
> but the `saveAsTable()` and `insertInto()` methods do not allow writing to a table whose catalog is explicitly specified as `spark_catalog`:
> {code:scala}
> scala> sql("CREATE NAMESPACE spark_catalog.ns")
> scala> Seq(0).toDF().write.saveAsTable("spark_catalog.ns.tbl")
> org.apache.spark.sql.AnalysisException: Couldn't find a catalog to handle the identifier spark_catalog.ns.tbl.
> at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:629)
> ... 47 elided
> scala> Seq(0).toDF().write.insertInto("spark_catalog.ns.tbl")
> org.apache.spark.sql.AnalysisException: Couldn't find a catalog to handle the identifier spark_catalog.ns.tbl.
> at org.apache.spark.sql.DataFrameWriter.insertInto(DataFrameWriter.scala:498)
> ... 47 elided
> {code}
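> For reference, a minimal sketch of the behavior this sub-task aims to enable (hypothetical, assuming the `spark_catalog.ns` namespace created above), so that the DataFrameWriter calls match the v1 INSERT INTO behavior:
> {code:scala}
> // Sketch of the expected behavior after the fix: `spark_catalog` is treated as
> // the default session catalog, so these calls should succeed instead of failing
> // with "Couldn't find a catalog to handle the identifier".
> scala> Seq(0).toDF().write.saveAsTable("spark_catalog.ns.tbl")
> scala> Seq(1).toDF().write.insertInto("spark_catalog.ns.tbl")
> scala> sql("SELECT * FROM spark_catalog.ns.tbl").show()
> {code}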