Posted to issues@spark.apache.org by "Dongjoon Hyun (JIRA)" <ji...@apache.org> on 2018/10/01 04:33:00 UTC

[jira] [Assigned] (SPARK-25579) Use quoted attribute names if needed in pushed ORC predicates

     [ https://issues.apache.org/jira/browse/SPARK-25579?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun reassigned SPARK-25579:
-------------------------------------

    Assignee: Dongjoon Hyun

> Use quoted attribute names if needed in pushed ORC predicates
> -------------------------------------------------------------
>
>                 Key: SPARK-25579
>                 URL: https://issues.apache.org/jira/browse/SPARK-25579
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Dongjoon Hyun
>            Assignee: Dongjoon Hyun
>            Priority: Major
>
> This issue aims to fix an ORC performance regression in the Spark 2.4.0 RCs relative to Spark 2.3.2: for column names containing `.`, the pushed ORC predicates are ignored, so the reader scans every row instead of skipping stripes.
> *Spark 2.3.2*
> {code:java}
> scala> val df = spark.range(Int.MaxValue).sample(0.2).toDF("col.with.dot")
> scala> df.write.mode("overwrite").orc("/tmp/orc")
> scala> df.write.mode("overwrite").parquet("/tmp/parquet")
> scala> spark.sql("set spark.sql.orc.impl=native")
> scala> spark.sql("set spark.sql.orc.filterPushdown=true")
> scala> spark.time(spark.read.orc("/tmp/orc").where("`col.with.dot` = 50000").count)
> Time taken: 803 ms
> scala> spark.time(spark.read.parquet("/tmp/parquet").where("`col.with.dot` = 50000").count)
> Time taken: 5573 ms
> {code}
> *Spark 2.4.0 RC2*
> {code:java}
> scala> spark.time(spark.read.orc("/tmp/orc").where("`col.with.dot` = 50000").count)
> Time taken: 2405 ms
> {code}
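> 
> The title suggests the shape of the fix: backtick-quote attribute names that contain `.` before building the pushed ORC predicate, so the SearchArgument builder sees a single column name rather than a nested field access. A minimal sketch of such a quoting step, assuming a Scala-side helper (the helper name is illustrative, not necessarily the final API):
> {code:java}
> // Sketch (hypothetical helper): wrap an attribute name containing dots
> // in backticks so the ORC SearchArgument builder treats it as one
> // column name, not a nested field access.
> private def quoteAttributeNameIfNeeded(name: String): String = {
>   if (!name.contains("`") && name.contains(".")) {
>     s"`$name`"
>   } else {
>     name
>   }
> }
> {code}
> One way to check whether the predicate is actually pushed is to call .explain on the query above and look for the PushedFilters entry in the scan node of the physical plan.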


