Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2020/10/18 09:53:00 UTC

[jira] [Commented] (SPARK-33168) spark REST API Unable to get JobDescription

    [ https://issues.apache.org/jira/browse/SPARK-33168?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17216154#comment-17216154 ] 

Hyukjin Kwon commented on SPARK-33168:
--------------------------------------

I can retrieve the job description as below:


{code:java}
localhost:4040/api/v1/applications/local-1603014688155/jobs
[ {
  "jobId" : 0,
  "name" : "count at <console>:24",
  "description" : "test_count",
  "submissionTime" : "2020-10-18T09:51:32.690GMT",
  "completionTime" : "2020-10-18T09:51:33.473GMT",
  "stageIds" : [ 0, 1 ],
  "status" : "SUCCEEDED",
  "numTasks" : 17,
  "numActiveTasks" : 0,
  "numCompletedTasks" : 17,
  "numSkippedTasks" : 0,
  "numFailedTasks" : 0,
  "numKilledTasks" : 0,
  "numCompletedIndices" : 17,
  "numActiveStages" : 0,
  "numCompletedStages" : 2,
  "numSkippedStages" : 0,
  "numFailedStages" : 0,
  "killedTasksSummary" : { }
} ]
{code}
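As a side note, the JSON above can be checked programmatically once fetched from the jobs endpoint. A minimal sketch (not part of the original report; the response body is abbreviated to the fields relevant here, and a real application id would differ):

```python
import json

# Abbreviated response from /api/v1/applications/<app-id>/jobs,
# matching the structure shown in the comment above.
response_body = '''[ {
  "jobId" : 0,
  "name" : "count at <console>:24",
  "description" : "test_count",
  "status" : "SUCCEEDED"
} ]'''

jobs = json.loads(response_body)

# The "description" field carries the value set via
# SparkContext.setJobDescription; jobs without one omit the field.
descriptions = [job.get("description") for job in jobs]
print(descriptions)
```

In practice the body would come from an HTTP GET against the running application's UI port (4040 by default) rather than a string literal.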

> spark REST API Unable to get JobDescription
> -------------------------------------------
>
>                 Key: SPARK-33168
>                 URL: https://issues.apache.org/jira/browse/SPARK-33168
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.4.4
>            Reporter: zhaoyachao
>            Priority: Major
>
> After setting a job description in Spark, the REST API (localhost:4040/api/v1/applications/xxx/jobs) does not return the job description, although it is displayed at localhost:4040/jobs
> spark.sparkContext.setJobDescription("test_count")
> spark.range(100).count()



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org