Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2019/05/21 03:59:39 UTC

[jira] [Updated] (SPARK-20800) Allow users to set job group when connecting through the SQL thrift server

     [ https://issues.apache.org/jira/browse/SPARK-20800?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-20800:
---------------------------------
    Labels: bulk-closed  (was: )

> Allow users to set job group when connecting through the SQL thrift server
> --------------------------------------------------------------------------
>
>                 Key: SPARK-20800
>                 URL: https://issues.apache.org/jira/browse/SPARK-20800
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.1.0
>            Reporter: Tim Zeyl
>            Priority: Minor
>              Labels: bulk-closed
>
> It would be useful for users to be able to set the job group through Thrift Server clients such as beeline, so that jobs in the event log can be grouped together logically. This would help in tracking the performance of repeated runs of similar SQL queries, which the user could tag with the same job group ID. Currently, each SQL query (and its corresponding job) is assigned a random UUID as the job group.
> Ideally users could set the job group in two ways (sketched below):
> 1. by issuing a SQL command before their query (for example, SET spark.sql.thriftserver.jobGroupID=jobA);
> 2. by passing a Hive conf parameter through beeline to set the job group for the whole session.
> Alternatively, if people think the job group needs to remain a random UUID for each SQL query, introducing another parameter that is written into the job properties field of the event log would still be helpful for tracking the performance of repeated runs.
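
For illustration, a rough sketch of what the two proposed options might look like from beeline if they were implemented. Nothing below exists in Spark today: the property name spark.sql.thriftserver.jobGroupID is the example name from option 1 above (reused for option 2 as a placeholder), and the connection URL and table name are made up.

    -- Option 1 (proposed, not implemented): set the job group from SQL before the query
    SET spark.sql.thriftserver.jobGroupID=jobA;
    SELECT count(*) FROM web_events WHERE event_date = '2017-05-18';

    # Option 2 (proposed, not implemented): pass the job group as a conf parameter
    # when starting beeline, so it applies to every query in the session
    beeline -u jdbc:hive2://thriftserver-host:10000/default \
            --hiveconf spark.sql.thriftserver.jobGroupID=jobA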
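For comparison, a job group can already be set programmatically on a SparkContext, which is presumably what the Thrift Server would call under the hood, and custom local properties already end up in the job properties field of the event log. A minimal Scala sketch of that existing API (the tag name query.tag is a hypothetical placeholder):

    import org.apache.spark.sql.SparkSession

    object JobGroupSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("job-group-sketch")
          .master("local[*]")
          .getOrCreate()
        val sc = spark.sparkContext

        // Tag all jobs submitted from this thread with a caller-chosen group id.
        // The id is recorded as the local property "spark.jobGroup.id" on each job
        // in the event log, so repeated runs of the same query can be grouped.
        sc.setJobGroup("jobA", "nightly run of query A", interruptOnCancel = false)

        // Custom local properties are also written to the job properties field of
        // the event log, which is one way to attach an extra tracking tag.
        sc.setLocalProperty("query.tag", "jobA-run-42")  // hypothetical tag name

        spark.sql("SELECT 42 AS answer").collect()

        // Stop tagging jobs from this thread once the query is done.
        sc.clearJobGroup()
        spark.stop()
      }
    }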



