Posted to issues@spark.apache.org by "Ohad Raviv (JIRA)" <ji...@apache.org> on 2018/09/24 07:16:00 UTC

[jira] [Updated] (SPARK-23985) predicate push down doesn't work with simple compound partition spec

     [ https://issues.apache.org/jira/browse/SPARK-23985?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ohad Raviv updated SPARK-23985:
-------------------------------
    Description: 
while predicate push down works with this query: 
{code:sql}
select * from (
   select *, row_number() over (partition by a order by b) from t1
)z 
where a>1
{code}
it doesn't work with:
{code:sql}
select * from (
   select *, row_number() over (partition by concat(a,'lit') order by b) from t1
)z 
where a>1
{code}
 
 I added a test to FilterPushdownSuite which I think reproduces the problem:
{code:scala}
  test("Window: predicate push down -- ohad") {
    val winExpr = windowExpr(count('b),
      windowSpec(Concat('a :: Nil) :: Nil, 'b.asc :: Nil, UnspecifiedFrame))

    val originalQuery = testRelation.select('a, 'b, 'c, winExpr.as('window)).where('a > 1)
    val correctAnswer = testRelation
      .where('a > 1).select('a, 'b, 'c)
      .window(winExpr.as('window) :: Nil, 'a :: Nil, 'b.asc :: Nil)
      .select('a, 'b, 'c, 'window).analyze

    comparePlans(Optimize.execute(originalQuery.analyze), correctAnswer)
  }
{code}
I will try to create a PR with a fix.
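
For context, here is a minimal, self-contained sketch of what I believe is going on (the class and method names below are illustrative, not Spark's actual optimizer code): the filter-pushdown rule appears to push a predicate below a Window only when every partition expression is a bare column reference, so a compound expression like {{concat(a, 'lit')}} disables the pushdown even though {{a}} itself is part of the partition spec.

```scala
// Illustrative model only -- not actual Spark code. It mimics the suspected
// guard in the predicate-pushdown rule: pushdown through a Window is allowed
// only if every partition expression is a plain attribute reference.
sealed trait Expression
case class AttributeReference(name: String) extends Expression
case class Concat(children: Seq[Expression]) extends Expression

object PushdownSketch {
  // Pushdown is attempted only when all partition expressions are bare columns.
  def canPushThroughWindow(partitionSpec: Seq[Expression]): Boolean =
    partitionSpec.forall(_.isInstanceOf[AttributeReference])

  def main(args: Array[String]): Unit = {
    val simple   = Seq(AttributeReference("a"))              // partition by a
    val compound = Seq(Concat(Seq(AttributeReference("a")))) // partition by concat(a, 'lit')
    println(canPushThroughWindow(simple))   // true  -> filter pushed below the Window
    println(canPushThroughWindow(compound)) // false -> filter stays above the Window
  }
}
```

Under this model, relaxing the guard to also accept deterministic expressions whose referenced attributes all come from the partition spec would cover the {{concat}} case.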

  was:
while predicate push down works with this query: 
{code:sql}
select *, row_number() over (partition by a order by b) from t1 where a>1
{code}
it doesn't work with:
{code:sql}
select *, row_number() over (partition by concat(a,'lit') order by b) from t1 where a>1
{code}
 
I added a test to FilterPushdownSuite which I think reproduces the problem:
{code:scala}
  test("Window: predicate push down -- ohad") {
    val winExpr = windowExpr(count('b),
      windowSpec(Concat('a :: Nil) :: Nil, 'b.asc :: Nil, UnspecifiedFrame))

    val originalQuery = testRelation.select('a, 'b, 'c, winExpr.as('window)).where('a > 1)
    val correctAnswer = testRelation
      .where('a > 1).select('a, 'b, 'c)
      .window(winExpr.as('window) :: Nil, 'a :: Nil, 'b.asc :: Nil)
      .select('a, 'b, 'c, 'window).analyze

    comparePlans(Optimize.execute(originalQuery.analyze), correctAnswer)
  }
{code}

I will try to create a PR with a fix.



> predicate push down doesn't work with simple compound partition spec
> --------------------------------------------------------------------
>
>                 Key: SPARK-23985
>                 URL: https://issues.apache.org/jira/browse/SPARK-23985
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.0
>            Reporter: Ohad Raviv
>            Priority: Minor
>
> while predicate push down works with this query: 
> {code:sql}
> select * from (
>    select *, row_number() over (partition by a order by b) from t1
> )z 
> where a>1
> {code}
> it doesn't work with:
> {code:sql}
> select * from (
>    select *, row_number() over (partition by concat(a,'lit') order by b) from t1
> )z 
> where a>1
> {code}
>  
>  I added a test to FilterPushdownSuite which I think reproduces the problem:
> {code:scala}
>   test("Window: predicate push down -- ohad") {
>     val winExpr = windowExpr(count('b),
>       windowSpec(Concat('a :: Nil) :: Nil, 'b.asc :: Nil, UnspecifiedFrame))
>     val originalQuery = testRelation.select('a, 'b, 'c, winExpr.as('window)).where('a > 1)
>     val correctAnswer = testRelation
>       .where('a > 1).select('a, 'b, 'c)
>       .window(winExpr.as('window) :: Nil, 'a :: Nil, 'b.asc :: Nil)
>       .select('a, 'b, 'c, 'window).analyze
>     comparePlans(Optimize.execute(originalQuery.analyze), correctAnswer)
>   }
> {code}
> I will try to create a PR with a fix.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org