Posted to issues@spark.apache.org by "Dongjoon Hyun (JIRA)" <ji...@apache.org> on 2016/08/10 16:53:20 UTC

[jira] [Commented] (SPARK-16994) Filter and limit are illegally permuted.

    [ https://issues.apache.org/jira/browse/SPARK-16994?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15415569#comment-15415569 ] 

Dongjoon Hyun commented on SPARK-16994:
---------------------------------------

Hi, [~TPolzer].
Indeed, it is a bug. `PushDownPredicate` seems to ignore `Limit` and pushes the `Filter` below it, which changes the query result. I'll make a PR for this issue soon.
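To illustrate why that rewrite is unsound, here is a sketch using plain Scala collections (not Spark APIs) standing in for the logical plan, with `take` playing the role of `Limit`:

```scala
object LimitFilterDemo extends App {
  val data = 1 to 100

  // What the query asks for: apply the limit first, then the filter.
  val limitThenFilter = data.take(10).filter(_ % 10 == 0)
  // -> Vector(10)

  // What the optimizer effectively produced: filter pushed below the limit.
  val filterThenLimit = data.filter(_ % 10 == 0).take(10)
  // -> Vector(10, 20, 30, 40, 50, 60, 70, 80, 90, 100)

  println(limitThenFilter)
  println(filterThenLimit)
}
```

The two orderings disagree whenever the filter is not a no-op on the limited prefix, so a predicate must never be pushed through a `Limit`.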

> Filter and limit are illegally permuted.
> ----------------------------------------
>
>                 Key: SPARK-16994
>                 URL: https://issues.apache.org/jira/browse/SPARK-16994
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>            Reporter: TobiasP
>
> {noformat}
> scala> spark.createDataset(1 to 100).limit(10).filter($"value" % 10 === 0).explain
> == Physical Plan ==
> CollectLimit 10
> +- *Filter ((value#875 % 10) = 0)
>    +- LocalTableScan [value#875]
> scala> spark.createDataset(1 to 100).limit(10).filter($"value" % 10 === 0).collect
> res23: Array[Int] = Array(10, 20, 30, 40, 50, 60, 70, 80, 90, 100)
> {noformat}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org