Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/08/22 08:54:21 UTC

[jira] [Assigned] (SPARK-16781) java launched by PySpark as gateway may not be the same java used in the spark environment

     [ https://issues.apache.org/jira/browse/SPARK-16781?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-16781:
------------------------------------

    Assignee: Apache Spark

> java launched by PySpark as gateway may not be the same java used in the spark environment
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-16781
>                 URL: https://issues.apache.org/jira/browse/SPARK-16781
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.6.2
>            Reporter: Michael Berman
>            Assignee: Apache Spark
>
> When launching Spark on a system with multiple Java installations, there are a few options for choosing which JRE to use, setting `JAVA_HOME` being the most straightforward.
> However, when PySpark's internal py4j launches its JavaGateway, it always invokes `java` directly, without qualification. This means you get whichever `java` is first on your PATH, which is not necessarily the one pointed to by Spark's `JAVA_HOME`.
> This could be seen as a py4j issue, but from py4j's point of view the fix is easy: make sure the `java` you want is first on your PATH. I can't find a way to make that happen reliably through the PySpark executor launch path, and it seems like something that should happen automatically. If I set `JAVA_HOME` when launching Spark, I would expect that to be the only `java` used throughout the stack.
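
For illustration, here is a minimal sketch of the behavior the reporter is asking for: prefer $JAVA_HOME/bin/java over a bare `java` when spawning the gateway JVM. This is not the actual Spark or py4j code; the helper name and the launch command below are hypothetical.

    import os
    import subprocess

    def gateway_java():
        # Hypothetical helper: pick the java binary for the gateway JVM.
        # Prefer $JAVA_HOME/bin/java when JAVA_HOME is set, so the gateway
        # runs on the same JVM as the rest of the Spark stack; otherwise
        # fall back to whichever `java` is first on PATH (the behavior
        # described in this issue).
        java_home = os.environ.get("JAVA_HOME")
        if java_home:
            return os.path.join(java_home, "bin", "java")
        return "java"

    # Illustrative launch only; the real py4j gateway command line differs.
    proc = subprocess.Popen([gateway_java(), "-version"])
    proc.wait()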



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
