Posted to issues@spark.apache.org by "Philipp Angerer (Jira)" <ji...@apache.org> on 2019/10/08 09:18:00 UTC
[jira] [Reopened] (SPARK-9636) Treat $SPARK_HOME as write-only
[ https://issues.apache.org/jira/browse/SPARK-9636?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Philipp Angerer reopened SPARK-9636:
------------------------------------
This is not fixed, and no reason was given for closing it, so I’ll reopen it.
> Treat $SPARK_HOME as write-only
> -------------------------------
>
> Key: SPARK-9636
> URL: https://issues.apache.org/jira/browse/SPARK-9636
> Project: Spark
> Issue Type: Improvement
> Components: Input/Output
> Affects Versions: 1.4.1
> Environment: Linux
> Reporter: Philipp Angerer
> Priority: Minor
> Labels: bulk-closed
>
> When starting Spark scripts as a user who has no write permission on the installation directory, most things work fine, except for the logs (e.g. for {{start-master.sh}}).
> Logs are by default written to {{$SPARK_LOG_DIR}} or, if that is unset, to {{$SPARK_HOME/logs}}.
> With such an installation, Spark should write logs to {{/var/log/spark/}} instead of throwing an error. That is easy to fix by testing a few log directories in sequence for writability before using one. I suggest trying {{$SPARK_LOG_DIR}} (if set) → {{/var/log/spark/}} → {{~/.cache/spark-logs/}} → {{$SPARK_HOME/logs/}}
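The fallback chain proposed above could be sketched in shell roughly as follows. This is a hypothetical helper for illustration, not part of Spark's actual scripts; the function name {{pick_log_dir}} and the candidate order are assumptions based on the suggestion in the description.

```shell
#!/usr/bin/env bash
# Hypothetical sketch: pick the first writable log directory from the
# proposed candidate chain, creating it if necessary.
pick_log_dir() {
  local candidates=(
    "${SPARK_LOG_DIR:-}"         # explicit override, if set
    "/var/log/spark"             # system-wide location
    "${HOME}/.cache/spark-logs"  # per-user fallback
    "${SPARK_HOME:-.}/logs"      # traditional default, tried last
  )
  local d
  for d in "${candidates[@]}"; do
    [ -n "$d" ] || continue
    # Usable only if we can create it (or it exists) and it is writable.
    if mkdir -p "$d" 2>/dev/null && [ -w "$d" ]; then
      printf '%s\n' "$d"
      return 0
    fi
  done
  return 1
}

SPARK_LOG_DIR="$(pick_log_dir)" || { echo "no writable log dir found" >&2; exit 1; }
echo "logging to $SPARK_LOG_DIR"
```

A script sourcing such a helper would fall through to a user-writable directory whenever the system-wide ones are read-only, which is the behavior the description asks for.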
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org