Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/10/30 06:15:27 UTC

[jira] [Commented] (SPARK-10829) Scan DataSource with predicate expression combining partition key and attributes doesn't work

    [ https://issues.apache.org/jira/browse/SPARK-10829?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14981915#comment-14981915 ] 

Apache Spark commented on SPARK-10829:
--------------------------------------

User 'cloud-fan' has created a pull request for this issue:
https://github.com/apache/spark/pull/9370

> Scan DataSource with predicate expression combining partition key and attributes doesn't work
> ---------------------------------------------------------------------------------------------
>
>                 Key: SPARK-10829
>                 URL: https://issues.apache.org/jira/browse/SPARK-10829
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Cheng Hao
>            Assignee: Cheng Hao
>            Priority: Critical
>             Fix For: 1.6.0
>
>
> To reproduce it, run the following code:
> {code}
> withSQLConf(SQLConf.PARQUET_FILTER_PUSHDOWN_ENABLED.key -> "true") {
>   withTempPath { dir =>
>     val path = s"${dir.getCanonicalPath}/part=1"
>     (1 to 3).map(i => (i, i.toString)).toDF("a", "b").write.parquet(path)
>     // If the "part = 0" predicate gets pushed down, this query will throw an
>     // exception, since "part" is a partition column and not a valid column in
>     // the actual Parquet file
>     checkAnswer(
>       sqlContext.read.parquet(path).filter("a > 0 and (part = 0 or a > 1)"),
>       (2 to 3).map(i => Row(i, i.toString, 1)))
>   }
> }
> {code}
> We expect the result (columns a, b, part) to be:
> {code}
> 2, 2, 1
> 3, 3, 1
> {code}
> But instead we get:
> {code}
> 1, 1, 1
> 2, 2, 1
> 3, 3, 1
> {code}
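>
> Note that the predicate "part = 0 or a > 1" references both the partition column "part" and the data column "a", so it can neither prune partitions on its own nor be pushed into the Parquet reader; it has to remain as a filter evaluated after the scan. As a minimal, self-contained sketch of this classification (not Spark's actual planner code; the names here are hypothetical), splitting the filter's top-level conjuncts by the columns they reference could look like:
> {code}
> object PredicateSplitSketch {
>   // A predicate is modeled only by its text and the set of columns it references.
>   case class Pred(text: String, references: Set[String])
>
>   case class Split(
>       partitionOnly: Seq[Pred], // usable for partition pruning
>       dataOnly: Seq[Pred],      // safe to push down to the Parquet reader
>       mixed: Seq[Pred])         // must stay as a post-scan Filter
>
>   def split(conjuncts: Seq[Pred], partitionCols: Set[String]): Split = {
>     val (partOnly, rest) = conjuncts.partition(_.references.subsetOf(partitionCols))
>     val (dataOnly, mixed) = rest.partition(_.references.intersect(partitionCols).isEmpty)
>     Split(partOnly, dataOnly, mixed)
>   }
>
>   def main(args: Array[String]): Unit = {
>     // The filter from the repro, split at the top-level AND:
>     //   a > 0 AND (part = 0 OR a > 1)
>     val conjuncts = Seq(
>       Pred("a > 0", Set("a")),
>       Pred("part = 0 or a > 1", Set("part", "a")))
>     // "a > 0" ends up in dataOnly; "part = 0 or a > 1" ends up in mixed,
>     // so a correct plan must still evaluate it after the scan.
>     println(split(conjuncts, partitionCols = Set("part")))
>   }
> }
> {code}
> Under this classification, dropping the "mixed" bucket instead of keeping it as a post-scan filter would produce exactly the wrong result above, where row (1, 1, 1) survives.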


