Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/03/19 00:12:42 UTC
[jira] [Commented] (SPARK-19993) Caching logical plans containing subquery expressions does not work.
[ https://issues.apache.org/jira/browse/SPARK-19993?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15931497#comment-15931497 ]
Apache Spark commented on SPARK-19993:
--------------------------------------
User 'dilipbiswal' has created a pull request for this issue:
https://github.com/apache/spark/pull/17330
> Caching logical plans containing subquery expressions does not work.
> --------------------------------------------------------------------
>
> Key: SPARK-19993
> URL: https://issues.apache.org/jira/browse/SPARK-19993
> Project: Spark
> Issue Type: Sub-task
> Components: SQL
> Affects Versions: 2.1.0
> Reporter: Dilip Biswal
>
> Here is a simple repro that illustrates the problem. The second invocation of the SQL should have been served from the cache; however, the cache lookup currently fails.
> {code}
> scala> val ds = spark.sql("select * from s1 where s1.c1 in (select s2.c1 from s2 where s1.c1 = s2.c1)")
> ds: org.apache.spark.sql.DataFrame = [c1: int]
> scala> ds.cache
> res13: ds.type = [c1: int]
> scala> spark.sql("select * from s1 where s1.c1 in (select s2.c1 from s2 where s1.c1 = s2.c1)").explain(true)
> == Analyzed Logical Plan ==
> c1: int
> Project [c1#86]
> +- Filter c1#86 IN (list#78 [c1#86])
>    :  +- Project [c1#87]
>    :     +- Filter (outer(c1#86) = c1#87)
>    :        +- SubqueryAlias s2
>    :           +- Relation[c1#87] parquet
>    +- SubqueryAlias s1
>       +- Relation[c1#86] parquet
> == Optimized Logical Plan ==
> Join LeftSemi, ((c1#86 = c1#87) && (c1#86 = c1#87))
> :- Relation[c1#86] parquet
> +- Relation[c1#87] parquet
> {code}
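> For reference, a minimal setup sketch for the two tables the repro assumes (s1 and s2 are not created above; the single int column c1, the parquet format, and the /tmp paths are assumptions inferred from the plans shown):
> {code}
> // hypothetical paths; spark-shell auto-imports spark.implicits._, so toDF is available
> scala> Seq(1, 2, 3).toDF("c1").write.parquet("/tmp/s1")
> scala> Seq(2, 3, 4).toDF("c1").write.parquet("/tmp/s2")
> scala> spark.read.parquet("/tmp/s1").createOrReplaceTempView("s1")
> scala> spark.read.parquet("/tmp/s2").createOrReplaceTempView("s2")
> {code}
> Once the lookup works, the optimized plan of the second query should read from the cached data (an InMemoryRelation node) rather than re-planning the LeftSemi join, which is how a successful cache hit can be verified.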
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org