Posted to issues@spark.apache.org by "huangtengfei (Jira)" <ji...@apache.org> on 2021/09/03 02:37:00 UTC

[jira] [Comment Edited] (SPARK-36658) Expose executionId to QueryExecutionListener

    [ https://issues.apache.org/jira/browse/SPARK-36658?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17409210#comment-17409210 ] 

huangtengfei edited comment on SPARK-36658 at 9/3/21, 2:36 AM:
---------------------------------------------------------------

cc [~cloud_fan] could you share thoughts about this?


was (Author: ivoson):
cc [~cloud_fan] could you share any thoughts about this?

> Expose executionId to QueryExecutionListener
> --------------------------------------------
>
>                 Key: SPARK-36658
>                 URL: https://issues.apache.org/jira/browse/SPARK-36658
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.1.2
>            Reporter: huangtengfei
>            Priority: Minor
>
> Now in [QueryExecutionListener|https://github.com/apache/spark/blob/v3.2.0-rc2/sql/core/src/main/scala/org/apache/spark/sql/util/QueryExecutionListener.scala#L38] we have exposed API to get the query execution information:
> def onSuccess(funcName: String, qe: QueryExecution, durationNs: Long): Unit
> def onFailure(funcName: String, qe: QueryExecution, exception: Exception): Unit
>  
> But from these callbacks there is no clear way to tell which query execution they refer to. In Spark SQL, the executionId is the direct identifier of a query execution, so I think it makes sense to expose executionId to the QueryExecutionListener: people could then easily find the exact query in the UI or history server to track more information about the query execution. Currently there is no easy way to find the relevant executionId from a QueryExecution object. 
>  
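For illustration, a minimal listener against the current API might look like the following sketch. The two callback signatures are those quoted above from QueryExecutionListener; the class name LoggingListener and the println bodies are hypothetical. Note that neither callback receives an executionId, which is the gap this issue proposes to close.

```scala
import org.apache.spark.sql.execution.QueryExecution
import org.apache.spark.sql.util.QueryExecutionListener

// Hypothetical listener: the callbacks receive the QueryExecution
// object and timing/error details, but no executionId, so correlating
// a callback with a specific query in the UI is not straightforward.
class LoggingListener extends QueryExecutionListener {
  override def onSuccess(funcName: String, qe: QueryExecution, durationNs: Long): Unit = {
    println(s"$funcName succeeded in ${durationNs / 1e6} ms")
  }

  override def onFailure(funcName: String, qe: QueryExecution, exception: Exception): Unit = {
    println(s"$funcName failed: ${exception.getMessage}")
  }
}

// Registration (sketch): spark.listenerManager.register(new LoggingListener)
```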



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org