Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:23:43 UTC

[jira] [Updated] (SPARK-15486) dropTempTable does not work with backticks

     [ https://issues.apache.org/jira/browse/SPARK-15486?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-15486:
---------------------------------
    Labels: bulk-closed  (was: )

> dropTempTable does not work with backticks
> ------------------------------------------
>
>                 Key: SPARK-15486
>                 URL: https://issues.apache.org/jira/browse/SPARK-15486
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 2.0.0
>            Reporter: Luca Bruno
>            Priority: Major
>              Labels: bulk-closed
>
> We're using Spark commit db75ccb (not sure whether that's unreleased 2.0.0 or 2.1.0).
> We don't use Hive, since we have a custom filesystem hierarchy and like to use dots in table names; for this reason we use backticks when registering temporary tables.
> We have noticed that dropTempTable doesn't work as expected when using backticks, as can be reproduced with the code below:
> {code}
> from pyspark import SparkContext
> from pyspark.sql import SQLContext
>
> sc = SparkContext()
> sqlc = SQLContext(sc)
> data = sc.parallelize([{"col": "val"}])
> df = sqlc.createDataFrame(data)
> # Dots in the table name require backticks when registering.
> df.registerTempTable("`a.b.c`")
> print(sqlc.sql("select * from `a.b.c`").collect())
> sqlc.dropTempTable("`a.b.c`")
> # Expected to fail with "table not found"; instead the rows print again.
> print(sqlc.sql("select * from `a.b.c`").collect())
> {code}
> The above code prints the dataframe contents twice. We instead expect the second collect to fail, because the table should no longer exist. It seems dropTempTable is failing silently.
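> A quick way to make the silent failure visible (a sketch beyond our original reproduction, using only the public SQLContext.tableNames() API) is to list the registered table names around the drop:
> {code}
> # Sketch: inspect the catalog around the drop. If dropTempTable fails
> # silently, the dotted name should still appear in the second listing.
> print(sqlc.tableNames())       # before the drop
> sqlc.dropTempTable("`a.b.c`")
> print(sqlc.tableNames())       # after: 'a.b.c' presumably still listed
> {code}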
> Removing the backticks from the registerTempTable or dropTempTable calls is not an option, because we would then get an invalid syntax exception.
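> For concreteness, here is a hypothetical illustration of that exception (assuming the name is parsed as a table identifier, so the dots are read as qualifier separators):
> {code}
> # Hypothetical: without backticks, 'a.b.c' is parsed as a multi-part
> # identifier rather than a single table name, and the call is rejected.
> sqlc.dropTempTable("a.b.c")   # raises the invalid syntax exception
> {code}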


