Posted to issues@spark.apache.org by "Sungpeo Kook (Jira)" <ji...@apache.org> on 2021/08/25 07:22:00 UTC
[jira] [Created] (SPARK-36582) Spark HistoryPage shows 'NotFound' for applications whose earlier attempts were not logged
Sungpeo Kook created SPARK-36582:
------------------------------------
Summary: Spark HistoryPage shows 'NotFound' for applications whose earlier attempts were not logged
Key: SPARK-36582
URL: https://issues.apache.org/jira/browse/SPARK-36582
Project: Spark
Issue Type: Bug
Components: Web UI
Affects Versions: 3.1.2, 2.4.8
Reporter: Sungpeo Kook
The current HistoryPage shows the attemptId column only when hasMultipleAttempts is true; when it is false, the column is removed.
However, an application on YARN can fail in an early attempt without ever writing to the Spark application history, so only the later attempt gets logged.
For example, application_1628518360417_0028 has attemptId 2, but the Spark history contains only one attempt for it, so hasMultipleAttempts is false.
In this case the attemptId column is still needed.
{code:java}
[
  {
    "id": "application_1628518360417_0029",
    "name": "Spark Pi",
    "attempts": [
      {
        "attemptId": "1",
        "startTime": "2021-08-25T05:20:15.521GMT",
        "endTime": "2021-08-25T05:20:30.398GMT",
        "lastUpdated": "2021-08-25T05:20:30.475GMT",
        "duration": 14877,
        "sparkUser": "elixir-kook",
        "completed": true,
        "appSparkVersion": "2.4.5",
        "endTimeEpoch": 1629868830398,
        "lastUpdatedEpoch": 1629868830475,
        "startTimeEpoch": 1629868815521
      }
    ]
  },
  {
    "id": "application_1628518360417_0028",
    "name": "Spark Pi",
    "attempts": [
      {
        "attemptId": "2",
        "startTime": "2021-08-25T05:19:22.850GMT",
        "endTime": "2021-08-25T05:19:44.662GMT",
        "lastUpdated": "2021-08-25T05:19:44.726GMT",
        "duration": 21812,
        "sparkUser": "elixir-kook",
        "completed": true,
        "appSparkVersion": "2.4.5",
        "endTimeEpoch": 1629868784662,
        "lastUpdatedEpoch": 1629868784726,
        "startTimeEpoch": 1629868762850
      }
    ]
  },
  ....
]
{code}
The attemptId column should still be shown in this case.
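To illustrate the proposed rule, here is a minimal sketch (not the actual HistoryPage code, which lives in Scala in the Spark History Server UI) of how the column decision could be made from the `/api/v1/applications` JSON above. The function name `should_show_attempt_id_column` is hypothetical; the idea is that the column is shown not only when an application has multiple logged attempts, but also when any single logged attemptId differs from "1", which implies earlier unlogged YARN attempts:

```python
def should_show_attempt_id_column(applications):
    """Decide whether the attemptId column is needed.

    applications: parsed JSON list from the History Server's
    /api/v1/applications endpoint (see the payload above).
    """
    for app in applications:
        attempts = app.get("attempts", [])
        # Multiple logged attempts: the column is obviously needed.
        if len(attempts) > 1:
            return True
        # A single logged attempt whose id is not "1" implies earlier
        # YARN attempts that failed before writing any event log.
        if any(a.get("attemptId") not in (None, "1") for a in attempts):
            return True
    return False
```

With the two applications from the payload above, the second entry alone (a single attempt with attemptId "2") is enough to make this return True, which is exactly the case the current hasMultipleAttempts check misses.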
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org