Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2021/05/12 03:55:00 UTC
[jira] [Commented] (SPARK-35379) Improve InferFiltersFromConstraints rule performance when parsing spark sql
[ https://issues.apache.org/jira/browse/SPARK-35379?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17342992#comment-17342992 ]
Apache Spark commented on SPARK-35379:
--------------------------------------
User 'wankunde' has created a pull request for this issue:
https://github.com/apache/spark/pull/32514
> Improve InferFiltersFromConstraints rule performance when parsing spark sql
> ---------------------------------------------------------------------------
>
> Key: SPARK-35379
> URL: https://issues.apache.org/jira/browse/SPARK-35379
> Project: Spark
> Issue Type: Improvement
> Components: SQL
> Affects Versions: 3.1.1
> Reporter: Wan Kun
> Priority: Major
>
> The *InferFiltersFromConstraints* rule generates too many constraints when there are many aliases in a Project. For example:
>
> {code:java}
> test("Expression explosion when analyze test") {
> RuleExecutor.resetMetrics()
> Seq((1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14))
> .toDF("a", "b", "c", "d", "e", "f", "g", "h", "i", "j",
> "k", "l", "m", "n")
> .write.saveAsTable("test")
> val df = spark.table("test")
> val df2 = df.filter("a+b+c+d+e+f+g+h+i+j+k+l+m+n > 100")
> val df3 = df2.select('a as 'a1, 'b as 'b1,
> 'c as 'c1, 'd as 'd1, 'e as 'e1, 'f as 'f1,
> 'g as 'g1, 'h as 'h1, 'i as 'i1, 'j as 'j1,
> 'k as 'k1, 'l as 'l1, 'm as 'm1, 'n as 'n1)
> val df4 = df3.join(df2, df3("a1") === df2("a"))
> df4.explain(true)
> logWarning(RuleExecutor.dumpTimeSpent())
> }
> {code}
> So we need to optimize the *InferFiltersFromConstraints* rule to improve performance.
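For context, here is a rough, standalone sketch (plain Scala, not Spark source; the object and method names are hypothetical) of why aliasing every projected column can make constraint propagation expensive: if a constraint references n attributes and each attribute also has an alias, substituting originals and aliases in every combination produces 2^n candidate constraints, i.e. 16384 for the 14 columns in the test above.

{code:scala}
// Hypothetical illustration of the combinatorial blow-up, not Spark code.
object ConstraintExplosionSketch {
  // For each referenced attribute, either keep the original name or swap in its
  // alias; enumerating every combination doubles the candidate set per attribute.
  def substitutions(attrs: Seq[String], aliases: Map[String, String]): Seq[Set[String]] =
    attrs.foldLeft(Seq(Set.empty[String])) { (acc, attr) =>
      acc.flatMap(partial => Seq(partial + attr, partial + aliases(attr)))
    }

  def main(args: Array[String]): Unit = {
    val attrs   = ('a' to 'n').map(_.toString)            // the 14 columns from the test case
    val aliases = attrs.map(a => a -> (a + "1")).toMap    // a -> a1, b -> b1, ..., n -> n1
    println(substitutions(attrs, aliases).size)           // prints 16384 = 2^14 combinations
  }
}
{code}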
--
This message was sent by Atlassian Jira
(v8.3.4#803005)