Posted to issues@spark.apache.org by "Andrew Or (JIRA)" <ji...@apache.org> on 2014/05/12 21:21:14 UTC

[jira] [Updated] (SPARK-1808) bin/pyspark does not load default configuration properties

     [ https://issues.apache.org/jira/browse/SPARK-1808?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Andrew Or updated SPARK-1808:
-----------------------------

    Description: 
... because it doesn't go through spark-submit. Either we make bin/pyspark go through spark-submit (hard), or we extract the logic that loads the default configuration properties and apply those properties to the JVM that launches the py4j GatewayServer (easier).

Right now, the only way to set config values for bin/pyspark is to do it through SPARK_JAVA_OPTS in spark-env.sh, which is supposedly deprecated.
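The "easier" option above could look roughly like the following — a minimal, hypothetical sketch of parsing spark-defaults.conf into key/value pairs before the GatewayServer's JVM is launched. The function name and the simple whitespace-separated format are assumptions for illustration, not Spark's actual implementation (Spark loads the file as Java properties, which also accepts "=" separators):

```python
import os

def load_default_properties(conf_dir):
    """Read spark-defaults.conf from conf_dir, if present.

    Assumes one 'key value' pair per line, with '#' starting a comment
    and blank lines ignored. Returns a dict of the parsed properties.
    """
    props = {}
    path = os.path.join(conf_dir, "spark-defaults.conf")
    if not os.path.isfile(path):
        return props
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            # Key and value are separated by the first run of whitespace.
            key, _, value = line.partition(" ")
            props[key.strip()] = value.strip()
    return props
```

The resulting dict could then be turned into -D system properties for the launched JVM, mirroring what spark-submit already does for JVM-side entry points.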

  was:... because it doesn't go through spark-submit. Either we make it go through spark-submit (hard), or we extract the load default configurations logic and set them for the JVM that launches the py4j GatewayServer (easier).


> bin/pyspark does not load default configuration properties
> ----------------------------------------------------------
>
>                 Key: SPARK-1808
>                 URL: https://issues.apache.org/jira/browse/SPARK-1808
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.0.0
>            Reporter: Andrew Or
>             Fix For: 1.0.1
>
>
> ... because it doesn't go through spark-submit. Either we make bin/pyspark go through spark-submit (hard), or we extract the logic that loads the default configuration properties and apply those properties to the JVM that launches the py4j GatewayServer (easier).
> Right now, the only way to set config values for bin/pyspark is to do it through SPARK_JAVA_OPTS in spark-env.sh, which is supposedly deprecated.



--
This message was sent by Atlassian JIRA
(v6.2#6252)