Posted to issues@spark.apache.org by "holdenk (JIRA)" <ji...@apache.org> on 2018/08/24 21:24:00 UTC

[jira] [Updated] (SPARK-9636) Treat $SPARK_HOME as write-only

     [ https://issues.apache.org/jira/browse/SPARK-9636?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

holdenk updated SPARK-9636:
---------------------------
    Labels:   (was: easyfix)

> Treat $SPARK_HOME as write-only
> -------------------------------
>
>                 Key: SPARK-9636
>                 URL: https://issues.apache.org/jira/browse/SPARK-9636
>             Project: Spark
>          Issue Type: Improvement
>          Components: Input/Output
>    Affects Versions: 1.4.1
>         Environment: Linux
>            Reporter: Philipp Angerer
>            Priority: Minor
>
> When starting Spark scripts as a user who has no write permission on the installation directory, most things work fine, but writing logs fails (e.g. for {{start-master.sh}}).
> Logs are written by default to {{$SPARK_LOG_DIR}} or, if that is unset, to {{$SPARK_HOME/logs}}.
> With such an installation, Spark should write logs to {{/var/log/spark/}} instead of throwing an error. That is easy to fix by testing a few log directories in sequence for writability before using one. I suggest trying {{$SPARK_LOG_DIR}} (if set) → {{/var/log/spark/}} → {{~/.cache/spark-logs/}} → {{$SPARK_HOME/logs/}}.
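The fallback sequence proposed above could be sketched in shell along these lines. This is a hypothetical illustration, not actual Spark launcher code; the function name {{pick_log_dir}} and the candidate list are assumptions based on the description:

```shell
#!/usr/bin/env bash
# Hypothetical sketch of the proposed behavior (not Spark's actual scripts):
# try each candidate log directory in order and use the first writable one.
pick_log_dir() {
  # Order follows the suggestion: $SPARK_LOG_DIR -> /var/log/spark
  # -> ~/.cache/spark-logs -> $SPARK_HOME/logs
  local candidates=("$SPARK_LOG_DIR" "/var/log/spark" \
                    "$HOME/.cache/spark-logs" "$SPARK_HOME/logs")
  local dir
  for dir in "${candidates[@]}"; do
    [ -z "$dir" ] && continue               # skip unset entries (e.g. SPARK_LOG_DIR)
    mkdir -p "$dir" 2>/dev/null || continue # cannot create it: try the next one
    [ -w "$dir" ] && { printf '%s\n' "$dir"; return 0; }
  done
  return 1                                  # no writable directory found
}
```

A caller such as {{start-master.sh}} would then only fail if every candidate is unwritable, rather than failing immediately on a read-only {{$SPARK_HOME}}.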


