Posted to issues@spark.apache.org by "Dongjoon Hyun (JIRA)" <ji...@apache.org> on 2019/07/16 16:42:10 UTC
[jira] [Updated] (SPARK-23985) predicate push down doesn't work with simple compound partition spec
[ https://issues.apache.org/jira/browse/SPARK-23985?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-23985:
----------------------------------
Affects Version/s: (was: 2.4.0)
3.0.0
> predicate push down doesn't work with simple compound partition spec
> --------------------------------------------------------------------
>
> Key: SPARK-23985
> URL: https://issues.apache.org/jira/browse/SPARK-23985
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 3.0.0
> Reporter: Ohad Raviv
> Priority: Minor
>
> while predicate push down works with this query:
> {code:sql}
> select * from (
> select *, row_number() over (partition by a order by b) from t1
> )z
> where a>1
> {code}
> it doesn't work with:
> {code:sql}
> select * from (
> select *, row_number() over (partition by concat(a,'lit') order by b) from t1
> )z
> where a>1
> {code}
>
> I added a test to FilterPushdownSuite which I think recreates the problem:
> {code}
> test("Window: predicate push down -- ohad") {
>   val winExpr = windowExpr(count('b),
>     windowSpec(Concat('a :: Nil) :: Nil, 'b.asc :: Nil, UnspecifiedFrame))
>
>   val originalQuery = testRelation.select('a, 'b, 'c, winExpr.as('window)).where('a > 1)
>   val correctAnswer = testRelation
>     .where('a > 1).select('a, 'b, 'c)
>     .window(winExpr.as('window) :: Nil, 'a :: Nil, 'b.asc :: Nil)
>     .select('a, 'b, 'c, 'window).analyze
>
>   comparePlans(Optimize.execute(originalQuery.analyze), correctAnswer)
> }
> {code}
> I will try to create a PR with a correction.
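Editorial note on the likely cause: the optimizer's pushdown-through-Window rule only fires when every expression in the window's partition spec is a plain attribute reference; a filter on `a` can then be pushed below a window partitioned by `a`, but `concat(a, 'lit')` is a compound expression, so the check fails and the filter stays above the window even though pushing it would be semantically safe here. The following is a minimal, self-contained Scala sketch of that eligibility check using simplified stand-in classes (`Expr`, `AttributeRef`, `ConcatExpr` are illustrative, not Catalyst's actual types):

```scala
// Simplified stand-ins for Catalyst expressions (illustrative only).
sealed trait Expr
case class AttributeRef(name: String) extends Expr
case class Literal(value: String) extends Expr
case class ConcatExpr(children: Seq[Expr]) extends Expr

// Models the guard a pushdown rule applies before moving a filter on
// `filterCol` below a Window: every partition expression must be a bare
// attribute reference, and the filtered column must be one of them.
def canPushThroughWindow(partitionSpec: Seq[Expr], filterCol: String): Boolean =
  partitionSpec.forall(_.isInstanceOf[AttributeRef]) &&
    partitionSpec.exists {
      case AttributeRef(n) => n == filterCol
      case _               => false
    }

// partition by a              -> filter on a is pushable
val simpleSpec = Seq(AttributeRef("a"))
// partition by concat(a,'lit') -> filter on a is rejected by this guard,
// which is the behavior the ticket reports
val compoundSpec = Seq(ConcatExpr(Seq(AttributeRef("a"), Literal("lit"))))

println(canPushThroughWindow(simpleSpec, "a"))   // true
println(canPushThroughWindow(compoundSpec, "a")) // false
```

Relaxing the guard to also accept deterministic expressions whose references are covered by the filter's columns is one direction a fix could take.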
--
This message was sent by Atlassian JIRA
(v7.6.14#76016)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org