Posted to commits@spark.apache.org by gu...@apache.org on 2020/07/21 04:46:17 UTC

[spark] branch branch-3.0 updated: [MINOR][DOCS] add link for Debugging your Application in running-on-yarn.html#launching-spark-on-yarn

This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch branch-3.0
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/branch-3.0 by this push:
     new aaee1f8  [MINOR][DOCS] add link for Debugging your Application in running-on-yarn.html#launching-spark-on-yarn
aaee1f8 is described below

commit aaee1f816ebfb79115d3151576ee7c1c9a456609
Author: Brandon <br...@users.noreply.github.com>
AuthorDate: Tue Jul 21 13:42:19 2020 +0900

    [MINOR][DOCS] add link for Debugging your Application in running-on-yarn.html#launching-spark-on-yarn
    
    ### What changes were proposed in this pull request?
    Add a link for the "Debugging your Application" section in `running-on-yarn.html#launching-spark-on-yarn`.
    
    ### Why are the changes needed?
    Currently, the launching-spark-on-yarn section of the running-on-yarn.html page tells readers to refer to the "Debugging your Application" section. Adding a direct link saves readers the time of finding that section themselves.
      ![image](https://user-images.githubusercontent.com/20021316/87867542-80cc5500-c9c0-11ea-8560-5ddcb5a308bc.png)
    
    ### Does this PR introduce _any_ user-facing change?
    Yes.
    Docs changes.
    1. Add a link to the "Debugging your Application" section in the `running-on-yarn.html#launching-spark-on-yarn` section.
    Updated behavior:
    ![image](https://user-images.githubusercontent.com/20021316/87867534-6eeab200-c9c0-11ea-94ee-d3fa58157156.png)
    2. Update the Spark Properties link to an anchor-only link.
    
    ### How was this patch tested?
    A manual test was performed to verify the updated pages.
    
    Closes #29154 from brandonJY/patch-1.
    
    Authored-by: Brandon <br...@users.noreply.github.com>
    Signed-off-by: HyukjinKwon <gu...@apache.org>
    (cherry picked from commit 1267d80db6abaa130384b8e7b514c39aec3a8c77)
    Signed-off-by: HyukjinKwon <gu...@apache.org>
---
 docs/running-on-yarn.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/running-on-yarn.md b/docs/running-on-yarn.md
index 166fb87..046c545 100644
--- a/docs/running-on-yarn.md
+++ b/docs/running-on-yarn.md
@@ -60,7 +60,7 @@ For example:
         examples/jars/spark-examples*.jar \
         10
 
-The above starts a YARN client program which starts the default Application Master. Then SparkPi will be run as a child thread of Application Master. The client will periodically poll the Application Master for status updates and display them in the console. The client will exit once your application has finished running.  Refer to the "Debugging your Application" section below for how to see driver and executor logs.
+The above starts a YARN client program which starts the default Application Master. Then SparkPi will be run as a child thread of Application Master. The client will periodically poll the Application Master for status updates and display them in the console. The client will exit once your application has finished running.  Refer to the [Debugging your Application](#debugging-your-application) section below for how to see driver and executor logs.
 
 To launch a Spark application in `client` mode, do the same, but replace `cluster` with `client`. The following shows how you can run `spark-shell` in `client` mode:
 
@@ -84,7 +84,7 @@ Running Spark on YARN requires a binary distribution of Spark which is built wit
 Binary distributions can be downloaded from the [downloads page](https://spark.apache.org/downloads.html) of the project website.
 To build Spark yourself, refer to [Building Spark](building-spark.html).
 
-To make Spark runtime jars accessible from YARN side, you can specify `spark.yarn.archive` or `spark.yarn.jars`. For details please refer to [Spark Properties](running-on-yarn.html#spark-properties). If neither `spark.yarn.archive` nor `spark.yarn.jars` is specified, Spark will create a zip file with all jars under `$SPARK_HOME/jars` and upload it to the distributed cache.
+To make Spark runtime jars accessible from YARN side, you can specify `spark.yarn.archive` or `spark.yarn.jars`. For details please refer to [Spark Properties](#spark-properties). If neither `spark.yarn.archive` nor `spark.yarn.jars` is specified, Spark will create a zip file with all jars under `$SPARK_HOME/jars` and upload it to the distributed cache.
 
 # Configuration
 

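For context on the second hunk: the "Spark Properties" anchor it links to documents `spark.yarn.archive` and `spark.yarn.jars`, which are typically set in `conf/spark-defaults.conf`. A minimal sketch follows; the HDFS paths are illustrative assumptions, not taken from this commit:

```
# conf/spark-defaults.conf (illustrative; the HDFS paths are assumptions)
spark.yarn.archive   hdfs:///spark/spark-jars.zip
# or, alternatively, point at the individual runtime jars:
# spark.yarn.jars    hdfs:///spark/jars/*.jar
```

If neither property is set, as the docs note, Spark zips everything under `$SPARK_HOME/jars` and uploads it to the distributed cache on every submission, so pre-staging the archive can speed up application launch.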

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscribe@spark.apache.org
For additional commands, e-mail: commits-help@spark.apache.org