Posted to issues@spark.apache.org by "Philipp Angerer (JIRA)" <ji...@apache.org> on 2015/08/05 12:25:04 UTC

[jira] [Created] (SPARK-9636) Treat $SPARK_HOME as write-only

Philipp Angerer created SPARK-9636:
--------------------------------------

             Summary: Treat $SPARK_HOME as write-only
                 Key: SPARK-9636
                 URL: https://issues.apache.org/jira/browse/SPARK-9636
             Project: Spark
          Issue Type: Bug
          Components: Input/Output
    Affects Versions: 1.4.1
         Environment: Linux
            Reporter: Philipp Angerer


When starting the Spark scripts as an unprivileged user while Spark is installed in a directory that user has no write permission for, most things work fine, except for the logs (e.g. for {{start-master.sh}}).

Logs are written to {{$SPARK_LOG_DIR}} by default, or, if that variable is unset, to {{$SPARK_HOME/logs}}.
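
For reference, the current behavior described above amounts to something like the following shell logic (a paraphrased sketch, not the literal {{spark-daemon.sh}} code):

{code:bash}
# Paraphrased sketch of the current default (not the literal Spark script):
# fall back to $SPARK_HOME/logs when $SPARK_LOG_DIR is unset, then create it.
if [ -z "${SPARK_LOG_DIR}" ]; then
  export SPARK_LOG_DIR="${SPARK_HOME}/logs"
fi
mkdir -p "${SPARK_LOG_DIR}"  # fails here when $SPARK_HOME is not writable
{code}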

With such an installation, Spark should write its logs to {{/var/log/spark/}} instead of throwing an error. That is easy to fix by testing a few log directories in sequence for writability before using one. I suggest trying {{$SPARK_LOG_DIR}} (if set) → {{/var/log/spark/}} → {{~/.cache/spark-logs/}} → {{$SPARK_HOME/logs/}}, as sketched below.
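
A minimal sketch of how that fallback chain could look in shell, assuming the candidate directories suggested above (the helper function name is hypothetical, not existing Spark code):

{code:bash}
# Hypothetical helper: return the first writable directory from the suggested
# chain $SPARK_LOG_DIR → /var/log/spark → ~/.cache/spark-logs → $SPARK_HOME/logs.
pick_log_dir() {
  local dir
  for dir in "${SPARK_LOG_DIR}" /var/log/spark "${HOME}/.cache/spark-logs" "${SPARK_HOME}/logs"; do
    [ -n "${dir}" ] || continue                  # skip if $SPARK_LOG_DIR is unset
    mkdir -p "${dir}" 2> /dev/null               # try to create it; ignore failure
    if [ -d "${dir}" ] && [ -w "${dir}" ]; then  # first writable candidate wins
      echo "${dir}"
      return 0
    fi
  done
  return 1
}

SPARK_LOG_DIR="$(pick_log_dir)" || { echo "no writable log directory found" >&2; exit 1; }
{code}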



