Posted to issues@spark.apache.org by "Lantao Jin (Jira)" <ji...@apache.org> on 2020/08/05 07:02:00 UTC

[jira] [Commented] (SPARK-32535) Query with broadcast hints fail when query has a WITH clause

    [ https://issues.apache.org/jira/browse/SPARK-32535?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17171300#comment-17171300 ] 

Lantao Jin commented on SPARK-32535:
------------------------------------

I think this issue has been fixed by SPARK-32237. I cannot reproduce it on master.
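For anyone still on 3.0.0 before that fix, a possible workaround (a sketch only, not verified against the affected version, and assuming the temp views from the report below are already registered) is to request the broadcast through the DataFrame API instead of a SQL hint inside the WITH clause:

{code:java}
import org.apache.spark.sql.functions.broadcast

// Equivalent of sql2's join, but the broadcast is requested via the
// DataFrame API rather than a /*+ BROADCAST */ hint inside a CTE.
val a = spark.table("table1").as("a")
val b = spark.table("table2").as("b")
val result = broadcast(a)
  .join(b, a("COL_A") === b("COL_A"))
  .select(a("COL_A"))
result.show(false)
{code}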

> Query with broadcast hints fail when query has a WITH clause
> ------------------------------------------------------------
>
>                 Key: SPARK-32535
>                 URL: https://issues.apache.org/jira/browse/SPARK-32535
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: Arvind Krishnan
>            Priority: Major
>
> If a query has a WITH clause and a query hint (like `BROADCAST`), the query fails.
> In the code sample below, executing `sql2` fails, but `sql1` passes.
> {code:java}
> import spark.implicits._
> val df = List(
>   ("1", "B", "C"),
>   ("A", "2", "C"),
>   ("A", "B", "3")
> ).toDF("COL_A", "COL_B", "COL_C")
> df.createOrReplaceTempView("table1")
> val df1 = List(
>   ("A", "2", "3"),
>   ("1", "B", "3"),
>   ("1", "2", "C")
> ).toDF("COL_A", "COL_B", "COL_C")
> df1.createOrReplaceTempView("table2")
> val sql1 = "select /*+ BROADCAST(a) */ a.COL_A from table1 a inner join table2 b on a.COL_A = b.COL_A"
> val sql2 = "with X as (select /*+ BROADCAST(a) */ a.COL_A from table1 a inner join table2 b on a.COL_A = b.COL_A) select X.COL_A from X"
> val df2 = spark.sql(sql2)
> println(s"Row Count ${df2.count()}")
> println("Rows... ")
> df2.show(false)
> {code}
>  
> I tried executing this sample program with Spark 2.4.0, and both SQL statements work



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org