Posted to issues@spark.apache.org by "caoxuewen (JIRA)" <ji...@apache.org> on 2017/08/11 10:23:00 UTC

[jira] [Created] (SPARK-21707) Improvement a special case for non-deterministic filters in optimizer

caoxuewen created SPARK-21707:
---------------------------------

             Summary: Improvement a special case for non-deterministic filters in optimizer
                 Key: SPARK-21707
                 URL: https://issues.apache.org/jira/browse/SPARK-21707
             Project: Spark
          Issue Type: Improvement
          Components: SQL
    Affects Versions: 2.3.0
            Reporter: caoxuewen


Currently the optimizer already does a lot of special handling for non-deterministic projects and filters, but it is not good enough. This patch adds a new special case for non-deterministic filters, so that only the fields the user actually needs are read even when the filter is non-deterministic.
For example, when the filter condition is non-deterministic, e.g. it contains a non-deterministic function such as rand(), the HiveTableScans step currently produces plans like:

```
HiveTableScans plan:Aggregate [k#2L], [k#2L, k#2L, sum(cast(id#1 as bigint)) AS sum(id)#395L]
+- Project [d004#205 AS id#1, CEIL(c010#214) AS k#2L]
   +- Filter ((isnotnull(d004#205) && (rand(-4530215890880734772) <= 0.5)) && NOT (cast(cast(d004#205 as decimal(10,0)) as decimal(11,1)) = 0.0))
      +- MetastoreRelation XXX_database, XXX_table

HiveTableScans plan:Project [d004#205 AS id#1, CEIL(c010#214) AS k#2L]
+- Filter ((isnotnull(d004#205) && (rand(-4530215890880734772) <= 0.5)) && NOT (cast(cast(d004#205 as decimal(10,0)) as decimal(11,1)) = 0.0))
   +- MetastoreRelation XXX_database, XXX_table

HiveTableScans plan:Filter ((isnotnull(d004#205) && (rand(-4530215890880734772) <= 0.5)) && NOT (cast(cast(d004#205 as decimal(10,0)) as decimal(11,1)) = 0.0))
+- MetastoreRelation XXX_database, XXX_table

HiveTableScans plan:MetastoreRelation XXX_database, XXX_table

```
So HiveTableScan will read all the fields of the table, even though only 'd004' and 'c010' are needed, which hurts task performance.
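A rough way to reproduce the plan shape above is sketched below; the database, table, and column names (xxx_database.xxx_table, d004, c010) are only stand-ins for the XXX placeholders in the plans, not real objects.

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("NonDeterministicFilterScan")
  .enableHiveSupport()
  .getOrCreate()

// rand() makes the Filter non-deterministic, so the optimizer does not push a
// column-pruning Project below it; the Hive table scan then reads every column
// of the table even though only d004 and c010 are referenced.
val df = spark.sql(
  """
    |SELECT k, SUM(id) AS sum_id
    |FROM (
    |  SELECT d004 AS id, CEIL(c010) AS k
    |  FROM xxx_database.xxx_table
    |  WHERE d004 IS NOT NULL
    |    AND rand() <= 0.5
    |    AND CAST(d004 AS DECIMAL(11,1)) != 0.0
    |) t
    |GROUP BY k
  """.stripMargin)

df.explain(true)  // the optimized plan keeps the Filter directly on the relation
```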

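For illustration only, here is a hedged sketch (not the actual patch) of the kind of rule the description asks for: even when the Filter is non-deterministic, the scan only needs the attributes referenced by the filter condition plus those required by the parent Project, so a pruning Project can be inserted below the Filter. `Rule`, `Project`, and `Filter` are existing Catalyst classes; the rule name and its placement are made up.

```scala
import org.apache.spark.sql.catalyst.plans.logical.{Filter, LogicalPlan, Project}
import org.apache.spark.sql.catalyst.rules.Rule

// Hedged sketch, not the SPARK-21707 patch itself: prune the scan to the union
// of the attributes needed by the Project and the non-deterministic Filter.
object PruneColumnsBelowNondeterministicFilter extends Rule[LogicalPlan] {
  def apply(plan: LogicalPlan): LogicalPlan = plan transform {
    case p @ Project(_, f @ Filter(condition, child)) if !condition.deterministic =>
      // Attributes referenced by the projection list and the filter condition.
      val needed = child.output.filter(a => (p.references ++ f.references).contains(a))
      if (needed.size < child.output.size) {
        // Read only the needed columns from the child (e.g. the Hive relation).
        p.copy(child = f.copy(child = Project(needed, child)))
      } else {
        p
      }
  }
}
```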




--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org