Posted to issues@spark.apache.org by "Cheng Lian (JIRA)" <ji...@apache.org> on 2015/06/07 11:35:00 UTC

[jira] [Resolved] (SPARK-5451) And predicates are not properly pushed down

     [ https://issues.apache.org/jira/browse/SPARK-5451?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Cheng Lian resolved SPARK-5451.
-------------------------------
       Resolution: Fixed
    Fix Version/s: 1.5.0
         Assignee: Thomas Omans

This issue is fixed by upgrading Parquet to 1.7.0 (SPARK-7743).
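
For anyone verifying the fix, here is a minimal sketch (not an official test; it assumes a Spark 1.5 {{spark-shell}} with the upgraded Parquet and the {{large.parquet}} file written by the reproduction below; the filter is on the integer {{key}} column so the pushed-down predicate is a plain integer range):

{code}
// Illustrative only: sqlContext and sc are the instances predefined by spark-shell.
sqlContext.setConf("spark.sql.parquet.filterPushdown", "true")
sc.hadoopConfiguration.set("parquet.task.side.metadata", "false")

// Read the file produced by the reproduction below and register it as a table.
val df = sqlContext.read.parquet("large.parquet")
df.registerTempTable("large")

// With Parquet 1.7.0 the And predicate reaches the row group statistics filter
// intact, so for a selective range the "no row groups could be dropped" message
// should no longer appear in the driver log.
sqlContext.sql("SELECT value FROM large WHERE 1024 < key AND key < 2048").collect()
{code}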

> And predicates are not properly pushed down
> -------------------------------------------
>
>                 Key: SPARK-5451
>                 URL: https://issues.apache.org/jira/browse/SPARK-5451
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SQL
>    Affects Versions: 1.2.0, 1.2.1
>            Reporter: Cheng Lian
>            Assignee: Thomas Omans
>            Priority: Critical
>             Fix For: 1.5.0
>
>
> This issue is actually caused by PARQUET-173.
> The following {{spark-shell}} session can be used to reproduce this bug:
> {code}
> import org.apache.spark.sql.SQLContext
> val sqlContext = new SQLContext(sc)
> import sc._
> import sqlContext._
> case class KeyValue(key: Int, value: String)
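> // Write a Parquet file large enough to span multiple row groups, so that
> // row group level filtering has something to drop.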
> parallelize(1 to 1024 * 1024 * 20).
>   flatMap(i => Seq.fill(10)(KeyValue(i, i.toString))).
>   saveAsParquetFile("large.parquet")
> parquetFile("large.parquet").registerTempTable("large")
> hadoopConfiguration.set("parquet.task.side.metadata", "false")
> sql("SET spark.sql.parquet.filterPushdown=true")
> sql("SELECT value FROM large WHERE 1024 < value AND value < 2048").collect()
> {code}
> The driver log then shows:
> {code}
> There were no row groups that could be dropped due to filter predicates
> {code}
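>
> For reference, a minimal sketch (not taken from Spark's sources; the column name and bounds are illustrative) of the kind of conjunctive row group filter involved, built with parquet-mr's {{FilterApi}} (under the {{parquet}} package before 1.7.0, {{org.apache.parquet}} afterwards):
> {code}
> import org.apache.parquet.filter2.predicate.FilterApi._
> import org.apache.parquet.filter2.predicate.FilterPredicate
>
> // A conjunction of two range predicates on a single integer column, of the same
> // shape as the WHERE clause above. The row group statistics filter in Parquet
> // releases before 1.7.0 mishandles such And predicates (PARQUET-173), so no
> // row groups are dropped even when the range excludes them.
> val pred: FilterPredicate = and(
>   gt(intColumn("key"), Int.box(1024)),
>   lt(intColumn("key"), Int.box(2048)))
> {code}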



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
