Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2020/12/04 18:51:00 UTC

[jira] [Assigned] (SPARK-33663) Fix misleading message for uncaching when createOrReplaceTempView is called

     [ https://issues.apache.org/jira/browse/SPARK-33663?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-33663:
------------------------------------

    Assignee: Apache Spark

> Fix misleading message for uncaching when createOrReplaceTempView is called
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-33663
>                 URL: https://issues.apache.org/jira/browse/SPARK-33663
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: Terry Kim
>            Assignee: Apache Spark
>            Priority: Minor
>
> To repro:
> {code:java}
> scala> sql("CREATE TABLE table USING parquet AS SELECT 2")
> res0: org.apache.spark.sql.DataFrame = []                                       
> scala> val df = spark.table("table")
> df: org.apache.spark.sql.DataFrame = [2: int]
> scala> df.createOrReplaceTempView("t2")
> 20/12/04 10:16:24 WARN CommandUtils: Exception when attempting to uncache $name
> org.apache.spark.sql.AnalysisException: Table or view not found: t2;;
> 'UnresolvedRelation [t2], [], false
> 	at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42)
> 	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$1(CheckAnalysis.scala:113)
> 	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis$1$adapted(CheckAnalysis.scala:93)
> 	at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:183)
> 	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.checkAnalysis(CheckAnalysis.scala:93)
> 	at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.checkAnalysis$(CheckAnalysis.scala:90)
> 	at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:152)
> 	at org.apache.spark.sql.catalyst.analysis.Analyzer.$anonfun$executeAndCheck$1(Analyzer.scala:172)
> 	at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:214)
> 	at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:169)
> 	at org.apache.spark.sql.execution.QueryExecution.$anonfun$analyzed$1(QueryExecution.scala:73)
> 	at org.apache.spark.sql.catalyst.QueryPlanningTracker.measurePhase(QueryPlanningTracker.scala:111)
> 	at org.apache.spark.sql.execution.QueryExecution.$anonfun$executePhase$1(QueryExecution.scala:138)
> 	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:768)
> 	at org.apache.spark.sql.execution.QueryExecution.executePhase(QueryExecution.scala:138)
> 	at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:73)
> 	at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:71)
> 	at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:63)
> 	at org.apache.spark.sql.Dataset$.$anonfun$ofRows$1(Dataset.scala:90)
> 	at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:768)
> 	at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:88)
> 	at org.apache.spark.sql.DataFrameReader.table(DataFrameReader.scala:889)
> 	at org.apache.spark.sql.SparkSession.table(SparkSession.scala:589)
> 	at org.apache.spark.sql.internal.CatalogImpl.uncacheTable(CatalogImpl.scala:476)
> 	at org.apache.spark.sql.execution.command.CommandUtils$.uncacheTableOrView(CommandUtils.scala:392)
> 	at org.apache.spark.sql.execution.command.CreateViewCommand.run(views.scala:124)
> {code}
> The warning should not be logged at all: `t2` does not exist yet, so there is nothing to uncache and the failed lookup is expected. In addition, the message itself prints the literal string `$name` instead of the table name, because the interpolator is missing.
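> At the user level, the same situation can be avoided with a guard (a minimal sketch of a workaround, not the actual fix inside `CommandUtils.uncacheTableOrView`; `viewName` is an illustrative variable):
> {code:scala}
> // Hypothetical workaround sketch: only attempt to uncache when the
> // view already exists, so no spurious lookup failure is triggered.
> val viewName = "t2"
> if (spark.catalog.tableExists(viewName)) {
>   spark.catalog.uncacheTable(viewName)
> }
> df.createOrReplaceTempView(viewName)
> {code}
> The internal fix would presumably perform an equivalent existence check (or catch the not-found case silently) before logging the warning.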



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
