Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/08/24 19:06:20 UTC

[jira] [Updated] (SPARK-16781) java launched by PySpark as gateway may not be the same java used in the spark environment

     [ https://issues.apache.org/jira/browse/SPARK-16781?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-16781:
------------------------------
    Assignee: Sean Owen

Updated py4j to 0.10.3 so that the gateway launcher picks up JAVA_HOME.

> java launched by PySpark as gateway may not be the same java used in the spark environment
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-16781
>                 URL: https://issues.apache.org/jira/browse/SPARK-16781
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.6.2
>            Reporter: Michael Berman
>            Assignee: Sean Owen
>             Fix For: 2.0.1, 2.1.0
>
>
> When launching Spark on a system with multiple Javas installed, there are a few options for choosing which JRE to use, with setting `JAVA_HOME` being the most straightforward.
> However, when pyspark's internal py4j launches its JavaGateway, it always invokes `java` directly, without qualification. That means you get whatever `java` is first on your PATH, which is not necessarily the one Spark's `JAVA_HOME` points to.
> This could be seen as a py4j issue, but from their point of view the fix is easy: make sure the `java` you want is first on your PATH. I can't find a way to make that happen reliably through the pyspark executor launch path, and it seems like something that should happen automatically. If I set `JAVA_HOME` when launching Spark, I would expect that to be the only java used throughout the stack.
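For illustration, here is a minimal Python sketch of the resolution behaviour the fix aims for: prefer the JVM under JAVA_HOME and only fall back to a PATH lookup. The helper names (resolve_java, launch_gateway_process) and the jar_path argument are hypothetical, not py4j's actual API; this is not py4j's code.

    # Sketch only: resolve the java executable the way the fixed gateway
    # launcher is expected to, preferring JAVA_HOME over whatever `java`
    # happens to be first on PATH.
    import os
    import subprocess

    def resolve_java():
        java_home = os.environ.get("JAVA_HOME")
        if java_home:
            # Use the same JRE that Spark itself was configured with.
            return os.path.join(java_home, "bin", "java")
        # Fall back to PATH lookup, which is what caused the mismatch.
        return "java"

    def launch_gateway_process(jar_path, main_class="py4j.GatewayServer"):
        # Spawn the JVM side of the gateway with the resolved java binary.
        command = [resolve_java(), "-cp", jar_path, main_class]
        return subprocess.Popen(command)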


