Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2019/10/08 05:44:17 UTC

[jira] [Resolved] (SPARK-25537) spark.pyspark.driver.python when set in code doesn't work

     [ https://issues.apache.org/jira/browse/SPARK-25537?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-25537.
----------------------------------
    Resolution: Incomplete

> spark.pyspark.driver.python when set in code doesn't work
> ---------------------------------------------------------
>
>                 Key: SPARK-25537
>                 URL: https://issues.apache.org/jira/browse/SPARK-25537
>             Project: Spark
>          Issue Type: Documentation
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Venkat Sambath
>            Priority: Minor
>              Labels: bulk-closed
>
> spark.pyspark.driver.python and spark.pyspark.python, when set in application code, are not picked up by the driver or executors. They are only honored when set through --conf or in spark-defaults.conf. Can we add a line to the documentation stating that it is illegal to set these properties in the application, as is already done for spark.driver.extraJavaOptions? https://spark.apache.org/docs/latest/configuration.html#application-properties
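
A minimal sketch of the behavior described in the report (the interpreter path /usr/bin/python3 and the file name app.py are illustrative assumptions, not values from the issue): by the time application code runs, the driver's Python interpreter has already been launched, so setting these properties on the session builder has no effect, whereas supplying them at submit time or in spark-defaults.conf does.

    # Per the report above, setting the Python interpreter properties in
    # application code has no effect: the driver interpreter is already running.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .config("spark.pyspark.driver.python", "/usr/bin/python3")  # ignored at this point
        .config("spark.pyspark.python", "/usr/bin/python3")         # ignored at this point
        .getOrCreate()
    )

    # What does work (paths and file name are assumed for illustration):
    #   spark-submit \
    #     --conf spark.pyspark.driver.python=/usr/bin/python3 \
    #     --conf spark.pyspark.python=/usr/bin/python3 \
    #     app.py
    # or setting the same keys in conf/spark-defaults.conf before launching.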



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org