Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/09/13 05:46:20 UTC

[jira] [Commented] (SPARK-17517) Improve generated Code for BroadcastHashJoinExec

    [ https://issues.apache.org/jira/browse/SPARK-17517?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15486335#comment-15486335 ] 

Apache Spark commented on SPARK-17517:
--------------------------------------

User 'yaooqinn' has created a pull request for this issue:
https://github.com/apache/spark/pull/15071

> Improve generated Code for BroadcastHashJoinExec
> ------------------------------------------------
>
>                 Key: SPARK-17517
>                 URL: https://issues.apache.org/jira/browse/SPARK-17517
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>            Reporter: Kent Yao
>             Fix For: 2.1.0
>
>
> For the current `BroadcastHashJoinExec`, we generate the join code for non-unique keys like this: 
> ```java
> while (matches.hasNext)
>     matched = matches.next
>     check and read stream side row fields
>     check and read build side row fields
>     reset result row
>     write stream side row fields to result row
>     write build side row fields to result row
> ```
> In some cases, we don't need to check/read/write the stream side repeatedly inside such a while loop, e.g. `Inner Join with BuildRight`, or `BuildLeft && all left side fields are fixed length`, and so on. We may generate the code as below:
> ```java
> check and read stream side row fields
> reset result row
> write stream side row fields to result row
> while (matches.hasNext)
>     matched = matches.next
>     check and read build side row fields
>     write build side row fields to result row
> ```
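>
> To make the hoisting concrete, here is a minimal, self-contained Java sketch of the idea. It is not the actual generated code; the row layout, the `matches` iterator, and all names are hypothetical, chosen only to illustrate writing the stream-side fields once and rewriting only the build-side slot per match.
> ```java
> import java.util.Iterator;
> import java.util.List;
>
> // Hypothetical sketch of the hoisting idea: copy the stream-side fields into
> // the result buffer once per stream row, and only rewrite the build-side
> // slots inside the per-match loop. Not Spark's actual generated code.
> public class HoistedJoinSketch {
>     public static void main(String[] args) {
>         // Stream-side row: [key, payload]; build-side rows: [key, value].
>         Object[] streamRow = {1L, "left"};
>         List<Object[]> matches = List.of(new Object[]{1L, 10}, new Object[]{1L, 20});
>
>         // Result layout: [streamKey, streamPayload, buildValue].
>         Object[] resultRow = new Object[3];
>
>         // Hoisted: check and copy the stream-side fields exactly once.
>         resultRow[0] = streamRow[0];
>         resultRow[1] = streamRow[1];
>
>         Iterator<Object[]> it = matches.iterator();
>         while (it.hasNext()) {
>             Object[] matched = it.next();
>             // Only the build-side slot is rewritten per match.
>             resultRow[2] = matched[1];
>             System.out.println(resultRow[0] + ", " + resultRow[1] + ", " + resultRow[2]);
>         }
>     }
> }
> ```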



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org