Posted to issues@spark.apache.org by "xufei (Jira)" <ji...@apache.org> on 2019/11/26 02:03:00 UTC

[jira] [Updated] (SPARK-30014) for non-default catalog, namespace name is always needed in a query?

     [ https://issues.apache.org/jira/browse/SPARK-30014?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

xufei updated SPARK-30014:
--------------------------
    Attachment: DataSourceV2ExplainSuite.scala

> for non-default catalog, namespace name is always needed in a query?
> --------------------------------------------------------------------
>
>                 Key: SPARK-30014
>                 URL: https://issues.apache.org/jira/browse/SPARK-30014
>             Project: Spark
>          Issue Type: Question
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: xufei
>            Priority: Major
>         Attachments: DataSourceV2ExplainSuite.scala
>
>
> Hi,
> I'm trying to write a catalog plugin based on spark-3.0-preview, and I found that even after running 'use catalog.namespace' to set the current catalog and namespace, I still need fully qualified names in my queries.
> For example (a code sketch of these steps follows below): I add a catalog named 'example_catalog' that contains a database named 'test', which in turn contains a table 't' (i.e. 'example_catalog.test.t').
> I can query the table using 'select * from example_catalog.test.t' under the default catalog (which is spark_catalog).
> Then I run 'use example_catalog.test' to change the current catalog to 'example_catalog' and the current namespace to 'test'.
> After that I can query the table using 'select * from test.t',
> but 'select * from t' fails with a table-not-found exception.
> I want to know whether this is expected behavior. If yes, it seems a little weird, since I think after 'use example_catalog.test' all unqualified identifiers should be interpreted as 'example_catalog.test.identifier'.
>  
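> For reference, the sequence above can be reproduced with a snippet along these lines. This is only a sketch: the com.example.ExampleCatalog class name is hypothetical, while the spark.sql.catalog.<name> registration key and the SQL statements are the ones described above.
>
>     import org.apache.spark.sql.SparkSession
>
>     val spark = SparkSession.builder()
>       .master("local[*]")
>       // register a custom TableCatalog under the name 'example_catalog'
>       // (com.example.ExampleCatalog is a hypothetical implementation class)
>       .config("spark.sql.catalog.example_catalog", "com.example.ExampleCatalog")
>       .getOrCreate()
>
>     // works under the default catalog (spark_catalog) with a fully qualified name
>     spark.sql("select * from example_catalog.test.t").show()
>
>     // switch the current catalog to 'example_catalog' and the current namespace to 'test'
>     spark.sql("use example_catalog.test")
>
>     // works: the identifier is resolved against the current catalog
>     spark.sql("select * from test.t").show()
>
>     // fails with a table-not-found exception in 3.0-preview, which is the question here
>     spark.sql("select * from t").show()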



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
