Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 04:35:49 UTC

[jira] [Resolved] (SPARK-17517) Improve generated Code for BroadcastHashJoinExec

     [ https://issues.apache.org/jira/browse/SPARK-17517?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-17517.
----------------------------------
    Resolution: Incomplete

> Improve generated Code for BroadcastHashJoinExec
> ------------------------------------------------
>
>                 Key: SPARK-17517
>                 URL: https://issues.apache.org/jira/browse/SPARK-17517
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>            Reporter: Kent Yao
>            Priority: Major
>              Labels: bulk-closed
>
> For the current `BroadcastHashJoinExec`, when the join key is not unique we generate join code like this:
> {code:title=processNext.java|borderStyle=solid}
> while (matches.hasNext()) {
>     matched = matches.next()
>     check and read stream side row fields
>     check and read build side row fields
>     reset result row
>     write stream side row fields to result row
>     write build side row fields to result row
>     append(result row)
> }
> {code}
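> A minimal, self-contained Java sketch of that shape (illustrative only: `ResultBuffer`, the `long[]` rows, and the slot layout are hypothetical stand-ins for Spark's reused `UnsafeRow` writer, not actual codegen output):
> {code:title=CurrentShape.java|borderStyle=solid}
> import java.util.ArrayList;
> import java.util.Iterator;
> import java.util.List;
>
> public class CurrentShape {
>     // Reusable fixed-width output buffer, standing in for an UnsafeRow writer.
>     static class ResultBuffer {
>         final long[] slots;
>         ResultBuffer(int width) { slots = new long[width]; }
>         void write(int i, long v) { slots[i] = v; }
>         long[] snapshot() { return slots.clone(); }
>     }
>
>     // Current shape: the stream-side fields are rewritten on every iteration,
>     // even though they cannot change while we iterate over the matches.
>     static List<long[]> join(long[] stream, int buildWidth, Iterator<long[]> matches) {
>         List<long[]> out = new ArrayList<>();
>         ResultBuffer buf = new ResultBuffer(stream.length + buildWidth);
>         while (matches.hasNext()) {
>             long[] build = matches.next();
>             for (int i = 0; i < stream.length; i++) buf.write(i, stream[i]); // repeated per match
>             for (int j = 0; j < build.length; j++) buf.write(stream.length + j, build[j]);
>             out.add(buf.snapshot());
>         }
>         return out;
>     }
> }
> {code}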
> In some cases we don't need to check/read/write the stream side repeatedly inside the while loop, e.g. `Inner Join with BuildRight`, or `BuildLeft && all left side fields are fixed length`. In those cases we may generate the code as below:
> {code:title=processNext.java|borderStyle=solid}
> check and read stream side row fields
> reset result row
> write stream side row fields to result row
> while (matches.hasNext()) {
>     matched = matches.next()
>     check and read build side row fields
>     write build side row fields to result row
>     append(result row)
> }
> {code}
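> The same sketch with the stream-side work hoisted (same hypothetical stand-ins as above). Because the fields are fixed-length, each slot keeps its offset, so the per-match build-side writes cannot disturb the stream-side slots written once before the loop:
> {code:title=ProposedShape.java|borderStyle=solid}
> import java.util.ArrayList;
> import java.util.Iterator;
> import java.util.List;
>
> public class ProposedShape {
>     // Same hypothetical stand-in for a reused UnsafeRow-style writer.
>     static class ResultBuffer {
>         final long[] slots;
>         ResultBuffer(int width) { slots = new long[width]; }
>         void write(int i, long v) { slots[i] = v; }
>         long[] snapshot() { return slots.clone(); }
>     }
>
>     // Proposed shape: stream-side reads/writes happen once, before the loop.
>     static List<long[]> join(long[] stream, int buildWidth, Iterator<long[]> matches) {
>         List<long[]> out = new ArrayList<>();
>         ResultBuffer buf = new ResultBuffer(stream.length + buildWidth);
>         for (int i = 0; i < stream.length; i++) buf.write(i, stream[i]); // hoisted out of the loop
>         while (matches.hasNext()) {
>             long[] build = matches.next();
>             for (int j = 0; j < build.length; j++) buf.write(stream.length + j, build[j]);
>             out.add(buf.snapshot());
>         }
>         return out;
>     }
> }
> {code}
> For the same inputs both sketches produce identical output; the hoisted version just saves work proportional to the number of stream-side fields on every matched row, which adds up when a key has many matches.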



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org