Posted to issues@spark.apache.org by "Shixiong Zhu (JIRA)" <ji...@apache.org> on 2014/11/10 08:43:34 UTC

[jira] [Created] (SPARK-4313) "Thread Dump" link is broken in yarn-cluster mode

Shixiong Zhu created SPARK-4313:
-----------------------------------

             Summary: "Thread Dump" link is broken in yarn-cluster mode
                 Key: SPARK-4313
                 URL: https://issues.apache.org/jira/browse/SPARK-4313
             Project: Spark
          Issue Type: Bug
          Components: Web UI, YARN
            Reporter: Shixiong Zhu
            Priority: Minor


In yarn-cluster mode, the Web UI runs behind a YARN proxy server. Some features (or bugs?) of the YARN proxy server break the "Thread Dump" links.

1. The YARN proxy server follows HTTP redirects internally. If a user opens "http://example.com:8088/cluster/app/application_1415344371838_0012/executors", the proxy fetches "http://example.com:8088/cluster/app/application_1415344371838_0012/executors/" and returns its content, but the URL shown in the browser does not change. When the user then clicks "Thread Dump", the browser resolves the relative link against the un-redirected URL and jumps to "http://example.com:8088/proxy/application_1415344371838_0012/threadDump/?executorId=2", which is wrong. The correct link is "http://example.com:8088/proxy/application_1415344371838_0012/executors/threadDump/?executorId=2".
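The effect of the missing trailing path segment can be reproduced with standard relative-URL resolution (a minimal Python sketch using urllib.parse, not Spark or YARN code; the URLs are the ones from this report):

```python
from urllib.parse import urljoin

# Relative link on the executors page, as emitted by the Spark Web UI.
rel = "threadDump/?executorId=2"

# URL the browser is left on (the proxy redirected internally,
# so the address bar never gained the trailing "executors/").
stale_base = "http://example.com:8088/proxy/application_1415344371838_0012/executors"

# URL the browser would be on if the redirect had been visible.
redirected_base = "http://example.com:8088/proxy/application_1415344371838_0012/executors/"

# Without the trailing slash, "executors" is treated as a file name and
# replaced, so the relative link resolves one level too high.
print(urljoin(stale_base, rel))
# -> http://example.com:8088/proxy/application_1415344371838_0012/threadDump/?executorId=2

# With the trailing slash, the link resolves under "executors/" as intended.
print(urljoin(redirected_base, rel))
# -> http://example.com:8088/proxy/application_1415344371838_0012/executors/threadDump/?executorId=2
```

This is standard RFC 3986 relative-reference resolution, which is why the bug only shows up when the proxy hides the redirect from the browser.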

2. The YARN proxy server has a URL encode/decode bug. When a user accesses "http://example.com:8088/proxy/application_1415344371838_0006/executors/threadDump/?executorId=%3Cdriver%3E", the proxy requests "http://example.com:36429/executors/threadDump/?executorId=%25253Cdriver%25253E". But the Spark web server expects "http://example.com:36429/executors/threadDump/?executorId=%3Cdriver%3E". I will report this issue to the Hadoop community later.
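The garbled value corresponds to percent-encoding the already-encoded query parameter two more times (a minimal Python sketch with urllib.parse to show where "%25253Cdriver%25253E" comes from; the exact place YARN applies the extra encoding is an assumption, not taken from this report):

```python
from urllib.parse import quote, unquote

driver_id = "<driver>"

# What the browser sends: "<driver>" percent-encoded once.
sent = quote(driver_id, safe="")
print(sent)  # -> %3Cdriver%3E

# Re-encoding the already-encoded value turns each "%" into "%25".
once_more = quote(sent, safe="")
print(once_more)  # -> %253Cdriver%253E

# A second re-encoding yields the value the proxy actually forwards.
twice_more = quote(once_more, safe="")
print(twice_more)  # -> %25253Cdriver%25253E

# Spark decodes once, so only the singly-encoded form round-trips.
print(unquote(sent))  # -> <driver>
```

Decoding "%25253Cdriver%25253E" once gives "%253Cdriver%253E" rather than "<driver>", which is why the Spark web server cannot find the executor.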



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org