Posted to issues@spark.apache.org by "Yuming Wang (Jira)" <ji...@apache.org> on 2022/03/25 00:59:00 UTC

[jira] [Resolved] (SPARK-38570) Incorrect DynamicPartitionPruning caused by Literal

     [ https://issues.apache.org/jira/browse/SPARK-38570?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yuming Wang resolved SPARK-38570.
---------------------------------
    Fix Version/s: 3.3.0
       Resolution: Fixed

Issue resolved by pull request 35878
https://github.com/apache/spark/pull/35878

> Incorrect DynamicPartitionPruning caused by Literal
> ---------------------------------------------------
>
>                 Key: SPARK-38570
>                 URL: https://issues.apache.org/jira/browse/SPARK-38570
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.2.0
>            Reporter: mcdull_zhang
>            Assignee: mcdull_zhang
>            Priority: Minor
>             Fix For: 3.3.0
>
>
> The return value of Literal.references is an empty AttributeSet, and an empty set is a subset of every set, so a Literal is mistakenly treated as a partition column.
>  
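> To see why the check passes, note that an empty AttributeSet is trivially a subset of any AttributeSet. A minimal sketch against the Catalyst internals (not part of the public API) illustrating this:
> {code:java}
> import org.apache.spark.sql.catalyst.expressions.{AttributeReference, AttributeSet, Literal}
> import org.apache.spark.sql.types.IntegerType
>
> // A Literal carries no attribute references.
> val lit = Literal(1)
> assert(lit.references.isEmpty)
>
> // An empty AttributeSet is a subset of ANY AttributeSet, including
> // the partition-column set, so the pruning check passes spuriously.
> val partitionColumns =
>   AttributeSet(Seq(AttributeReference("part_col", IntegerType)()))
> assert(lit.references.subsetOf(partitionColumns))
> {code}
>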
> org.apache.spark.sql.execution.dynamicpruning.PartitionPruning#getFilterableTableScan:
> {code:java}
> val srcInfo: Option[(Expression, LogicalPlan)] = findExpressionAndTrackLineageDown(a, plan)
> srcInfo.flatMap {
>   case (resExp, l: LogicalRelation) =>
>     l.relation match {
>       case fs: HadoopFsRelation =>
>         val partitionColumns = AttributeSet(
>           l.resolve(fs.partitionSchema, fs.sparkSession.sessionState.analyzer.resolver))
>         // When resExp is a Literal, its references are empty, and an empty
>         // AttributeSet is a subset of any set, so this check incorrectly passes.
>         if (resExp.references.subsetOf(partitionColumns)) {
>           return Some(l)
>         } else {
>           None
>         }
>       case _ => None
>     }
> }
> {code}
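>
> One way to avoid the misclassification would be to require that resExp actually references at least one attribute before the subset test. A hypothetical sketch of such a guard (the actual fix landed in pull request 35878, linked above, and may differ):
> {code:java}
> // Hypothetical guard: a Literal has no references, so an explicit
> // non-emptiness check keeps it from passing the subset test.
> if (resExp.references.nonEmpty &&
>     resExp.references.subsetOf(partitionColumns)) {
>   return Some(l)
> } else {
>   None
> }
> {code}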



--
This message was sent by Atlassian Jira
(v8.20.1#820001)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org