Posted to issues@spark.apache.org by "Takeshi Yamamuro (Jira)" <ji...@apache.org> on 2020/03/19 11:56:00 UTC

[jira] [Resolved] (SPARK-31187) Sort the whole-stage codegen debug output by codegenStageId

     [ https://issues.apache.org/jira/browse/SPARK-31187?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Takeshi Yamamuro resolved SPARK-31187.
--------------------------------------
    Fix Version/s: 3.0.0
         Assignee: Kris Mok
       Resolution: Fixed

Resolved by [https://github.com/apache/spark/pull/27955]

> Sort the whole-stage codegen debug output by codegenStageId
> -----------------------------------------------------------
>
>                 Key: SPARK-31187
>                 URL: https://issues.apache.org/jira/browse/SPARK-31187
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.4.0, 2.4.1, 2.4.2, 2.4.3, 2.4.4, 2.4.5, 3.0.0
>            Reporter: Kris Mok
>            Assignee: Kris Mok
>            Priority: Minor
>             Fix For: 3.0.0
>
>
> Spark SQL's whole-stage codegen (WSCG) supports dumping the generated code to help with debugging. The generated code can be obtained through {{df.queryExecution.debug.codegen}} or via the SQL {{explain codegen}} statement.
> The generated code is currently printed in no particular order, which makes debugging harder than it needs to be. This ticket tracks a minor improvement: sort the codegen dump by {{codegenStageId}} in ascending order (a sketch of the sorting idea follows at the end of this description).
> After this change, the following query:
> {code}
> spark.range(10).agg(sum('id)).queryExecution.debug.codegen
> {code}
> will always dump the generated code in a natural, stable order.
> The number of codegen stages within a single SQL query tends to be very small, most likely fewer than 50, so the overhead of the added sort should be negligible.
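>
> The idea can be illustrated with a minimal, self-contained Scala sketch (not the actual code from the pull request): given whole-stage codegen outputs keyed by their stage IDs, sort them by {{codegenStageId}} before printing so the dump is stable across runs. The {{CodegenStageDump}} case class and {{dumpSorted}} helper are hypothetical names used only for illustration.
> {code}
> // Hypothetical sketch: sort whole-stage codegen dumps by codegenStageId before printing.
> case class CodegenStageDump(codegenStageId: Int, generatedCode: String)
>
> def dumpSorted(stages: Seq[CodegenStageDump]): String = {
>   stages
>     .sortBy(_.codegenStageId)  // ascending, stable ordering across runs
>     .map { s =>
>       s"== Subtree (codegenStageId=${s.codegenStageId}) ==\n${s.generatedCode}"
>     }
>     .mkString("\n\n")
> }
>
> // Stages collected in arbitrary order still print as 1, 2, 3.
> val unordered = Seq(
>   CodegenStageDump(2, "/* generated code for stage 2 */"),
>   CodegenStageDump(1, "/* generated code for stage 1 */"),
>   CodegenStageDump(3, "/* generated code for stage 3 */")
> )
> println(dumpSorted(unordered))
> {code}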



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org