Posted to issues@spark.apache.org by "Weiqing Yang (JIRA)" <ji...@apache.org> on 2016/07/02 00:33:11 UTC

[jira] [Commented] (SPARK-15923) Spark Application rest api returns "no such app: "

    [ https://issues.apache.org/jira/browse/SPARK-15923?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15359872#comment-15359872 ] 

Weiqing Yang commented on SPARK-15923:
--------------------------------------

Debugged in the cluster. The issue occurs whether the cluster is secure or insecure, and only applications run in yarn-client mode are affected.

Detailed JIRA description:
1. yarn-client mode:
Applications in yarn-client mode do not have an 'attemptId' in their records, e.g.:
"id": "application_1465778870517_0001",
"name": "Spark Pi",
"attempts": [
{"startTime": "2016-06-13T01:07:16.958GMT", "endTime" : "2016-06-13T01:09:29.668GMT", "sparkUser" : "hrt_qa", "completed" : true }
]
So when checking executor information in the web UI, the link used is http://<host>:18080/history/application_1465778870517_0001/executors/, which shows all of the executors' information. Note that there is no attemptId inside the link. On the other hand, calling the REST API http://<host>:18080/api/v1/applications/application_1465778870517_0001/1/executors, which has attemptId "1" inside, fails with errors such as "no such app" and "INFO ApplicationCache: Failed to load application attempt application_1465778870517_0001/Some(1)". If you instead call the REST API "http://<hostid>:18080/api/v1/applications/application_1465778870517_0001/executors", which has no attemptId inside, all the executors' information is returned.
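
To illustrate the mismatch, here is a minimal, self-contained Scala sketch (the class and object names are invented for illustration, not Spark's actual classes) of a lookup keyed on (appId, Option[attemptId]): a yarn-client record stored without an attemptId is only found when the request also carries no attemptId.
{code}
// Invented names for illustration only; not Spark's actual history server code.
case class AttemptInfo(attemptId: Option[String], sparkUser: String, completed: Boolean)

object AttemptLookupSketch {
  // A yarn-client application: a single attempt stored with no attemptId.
  val attempts: Map[(String, Option[String]), AttemptInfo] = Map(
    ("application_1465778870517_0001", None) -> AttemptInfo(None, "hrt_qa", completed = true)
  )

  def lookup(appId: String, attemptId: Option[String]): Option[AttemptInfo] =
    attempts.get((appId, attemptId))

  def main(args: Array[String]): Unit = {
    // .../applications/<appId>/1/executors parses attemptId "1" and misses the record:
    println(lookup("application_1465778870517_0001", Some("1"))) // None -> "no such app"
    // .../applications/<appId>/executors passes no attemptId and finds it:
    println(lookup("application_1465778870517_0001", None))      // Some(AttemptInfo(...))
  }
}
{code}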

2. yarn-cluster mode:
Applications in yarn-cluster mode do have an 'attemptId' in their records, e.g.:
"id" : "application_1465778870517_0002",
"name" : "Spark Pi",
"attempts" : [
{"attemptId": "1", "startTime" : "2016-06-13T01:12:48.797GMT", "endTime" : "2016-06-13T01:14:26.900GMT", "sparkUser" : "hrt_qa", "completed" : true }
]
We can check executor information through both the web UI and the REST API, since both URLs include attemptId "1":
http://<hostid>:18080/history/application_1465778870517_0002/1/executors/
http://<hostid>:18080/api/v1/applications/application_1465778870517_0002/1/executors

Summary:
The REST API URLs used to fetch job/executor information include an "attemptId". However, applications run in yarn-client mode have no attempt ID in their records, so those requests fail with "no such app".
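
For illustration only, here is one hypothetical way such a lookup could fall back when the requested attemptId does not match a yarn-client record. All names are invented, and this is not necessarily the approach the pull request will take.
{code}
// Hypothetical fallback sketch (invented names; not the actual fix):
// if the requested attemptId has no matching record but the application's single
// attempt carries no attemptId (the yarn-client case), serve that attempt instead
// of failing with "no such app".
case class Attempt(attemptId: Option[String], completed: Boolean)
case class AppInfo(id: String, attempts: Seq[Attempt])

object AttemptFallbackSketch {
  def resolveAttempt(app: AppInfo, requested: Option[String]): Option[Attempt] =
    app.attempts.find(_.attemptId == requested).orElse {
      app.attempts match {
        case Seq(only) if only.attemptId.isEmpty => Some(only) // tolerate yarn-client apps
        case _                                   => None
      }
    }

  def main(args: Array[String]): Unit = {
    val clientModeApp = AppInfo("application_1465778870517_0001", Seq(Attempt(None, completed = true)))
    println(resolveAttempt(clientModeApp, Some("1"))) // falls back to the only attempt
    println(resolveAttempt(clientModeApp, None))      // direct match
  }
}
{code}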

I am going to make a pull request for review.

> Spark Application rest api returns "no such app: <appId>"
> ---------------------------------------------------------
>
>                 Key: SPARK-15923
>                 URL: https://issues.apache.org/jira/browse/SPARK-15923
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.6.1
>            Reporter: Yesha Vora
>
> Env : secure cluster
> Scenario:
> * Run SparkPi application in yarn-client or yarn-cluster mode
> * After the application finishes, check the Spark History Server REST API to get details such as jobs / executors, etc.
> {code}
> http://<host>:18080/api/v1/applications/application_1465778870517_0001/1/executors
> {code}
> The REST API returns HTTP code 404 and prints "HTTP Data: no such app: application_1465778870517_0001"



