Posted to issues@spark.apache.org by "Xiao Li (JIRA)" <ji...@apache.org> on 2017/08/24 03:37:01 UTC

[jira] [Resolved] (SPARK-21807) The getAliasedConstraints function in LogicalPlan will take a long time when the number of expressions is greater than 100

     [ https://issues.apache.org/jira/browse/SPARK-21807?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiao Li resolved SPARK-21807.
-----------------------------
       Resolution: Fixed
    Fix Version/s: 2.3.0

> The getAliasedConstraints function in LogicalPlan will take a long time when the number of expressions is greater than 100
> ------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-21807
>                 URL: https://issues.apache.org/jira/browse/SPARK-21807
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.2.0
>            Reporter: eaton
>            Assignee: eaton
>             Fix For: 2.3.0
>
>
> The getAliasedConstraints function in LogicalPlan.scala clones the expression set every time an element is added, which takes a long time; a small sketch of this cost pattern follows the test below.
> Before the fix, the cost of getAliasedConstraints was:
> 100 expressions: 41 seconds
> 150 expressions: 466 seconds
> The test is like this:
> test("getAliasedConstraints") {
> val expressionNum = 150
> val aggExpression = (1 to expressionNum).map(i => Alias(Count(Literal(1)), s"cnt$i")())
> val aggPlan = Aggregate(Nil, aggExpression, LocalRelation())
> val beginTime = System.currentTimeMillis()
> val expressions = aggPlan.validConstraints
> println(s"validConstraints cost: ${System.currentTimeMillis() - beginTime}ms")
> // The size of Aliased expression is n * (n - 1) / 2 + n
> assert( expressions.size === expressionNum * (expressionNum - 1) / 2 + expressionNum)
> }
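> For expressionNum = 150 the assertion therefore expects 150 * 149 / 2 + 150 = 11325 constraints.
>
> For illustration only, here is a minimal, self-contained sketch in plain Scala (using mutable.HashSet, not Spark's ExpressionSet and not the actual fix) of the cost pattern described above: cloning the whole backing set for every single added element makes the loop quadratic, while cloning once and adding the new elements in bulk keeps it roughly linear.
>
> import scala.collection.mutable
>
> object CloningCostSketch {
>
>   // Time a block and print how long it took, mirroring the test above.
>   private def timed(label: String)(body: => Unit): Unit = {
>     val start = System.currentTimeMillis()
>     body
>     println(s"$label: ${System.currentTimeMillis() - start} ms")
>   }
>
>   def main(args: Array[String]): Unit = {
>     val n = 50000
>
>     // Quadratic: clone everything accumulated so far for every single add,
>     // like a set whose `+` copies its internal storage before inserting.
>     timed("clone per added element") {
>       var acc = mutable.HashSet.empty[Int]
>       (1 to n).foreach { i =>
>         val copy = acc.clone() // O(acc.size) copy just to insert one element
>         copy += i
>         acc = copy
>       }
>     }
>
>     // Roughly linear: clone once, then add the whole batch into that copy.
>     timed("clone once, add in bulk") {
>       val acc = mutable.HashSet.empty[Int]
>       val copy = acc.clone()
>       (1 to n).foreach(copy += _)
>     }
>   }
> }
>
> The same super-linear growth as the element count rises is what the timings above show for getAliasedConstraints.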



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org