Posted to issues@spark.apache.org by "Thomas Graves (Jira)" <ji...@apache.org> on 2021/04/20 13:58:00 UTC

[jira] [Resolved] (SPARK-34877) Add Spark AM Log link in case of master as yarn and deploy mode as client

     [ https://issues.apache.org/jira/browse/SPARK-34877?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Thomas Graves resolved SPARK-34877.
-----------------------------------
    Fix Version/s: 3.2.0
         Assignee: Saurabh Chawla
       Resolution: Fixed

> Add Spark AM Log link in case of master as yarn and deploy mode as client
> -------------------------------------------------------------------------
>
>                 Key: SPARK-34877
>                 URL: https://issues.apache.org/jira/browse/SPARK-34877
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core, YARN
>    Affects Versions: 3.1.1
>            Reporter: Saurabh Chawla
>            Assignee: Saurabh Chawla
>            Priority: Minor
>             Fix For: 3.2.0
>
>
> When running a Spark job with YARN as master and client deploy mode, the Spark driver and the Spark Application Master are launched as two separate processes, with the AM in its own YARN container. In various scenarios there is a need to look at the Spark Application Master logs to see resource allocation, decommissioning status, and other information exchanged between the YARN ResourceManager and the Spark Application Master.
> Until now, the only way to check this has been to find the container id of the AM and fetch its logs either with the YARN CLI or through the YARN RM Application History Server, as sketched below.
> This Jira adds a Spark AM log link for Spark jobs running in client mode on YARN. Instead of searching for the container id and then locating the logs, the AM logs can be opened directly from the Spark UI.
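> For reference, a minimal sketch of the pre-existing workflow. The class name, jar, and application id below are placeholders, and it assumes YARN log aggregation is enabled and a yarn CLI recent enough to support the -am option of yarn logs:
>
>     # Submit in client mode: the driver runs in the local client process,
>     # while the Application Master runs in a YARN container on the cluster.
>     spark-submit --master yarn --deploy-mode client \
>       --class org.example.MyApp my-app.jar
>
>     # Look up the application id assigned by the ResourceManager.
>     yarn application -list
>
>     # Fetch the logs of the first AM attempt's container; alternatively pass
>     # -containerId with the AM container id found via the RM or the
>     # Application History Server UI.
>     yarn logs -applicationId application_1618900000000_0001 -am 1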



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org