Posted to issues@spark.apache.org by "Jeff Zhang (JIRA)" <ji...@apache.org> on 2016/06/16 07:36:05 UTC

[jira] [Commented] (SPARK-15909) PySpark classpath uri incorrectly set

    [ https://issues.apache.org/jira/browse/SPARK-15909?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15333289#comment-15333289 ] 

Jeff Zhang commented on SPARK-15909:
------------------------------------

If I remember correctly, PySpark can only run in cluster mode on YARN. On Mesos it runs in client mode, which is why you see spark.driver.uri pointing to localhost.
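For anyone hitting this in the meantime: since conf/spark-env.sh is only sourced by the launcher scripts, a SparkContext built from scratch inside the REPL may not pick up SPARK_LOCAL_IP. A minimal client-mode workaround sketch, assuming the address from the reporter's spark-env.sh is reachable from the Mesos agents (spark.driver.host is a standard Spark property; the master URL and app name are copied from the report):

{code}
from pyspark import SparkContext, SparkConf

# Sketch: pin the driver's advertised address so Mesos executors do not
# try to call back to localhost. 172.20.30.158 comes from the reporter's
# spark-env.sh and is an assumption about what is reachable cluster-side.
conf = (
    SparkConf()
        .setMaster("mesos://zk://foo:2181/mesos")
        .setAppName("Jupyter PySpark")
        .set("spark.driver.host", "172.20.30.158")
)
sc = SparkContext(conf=conf)
{code}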

> PySpark classpath uri incorrectly set
> -------------------------------------
>
>                 Key: SPARK-15909
>                 URL: https://issues.apache.org/jira/browse/SPARK-15909
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.6.1
>            Reporter: Liam Fisk
>
> PySpark behaves differently when the SparkContext is created within the REPL than when it is initialised by the shell.
> My conf/spark-env.sh file contains:
> {code}
> #!/bin/bash
> export SPARK_LOCAL_IP=172.20.30.158
> export LIBPROCESS_IP=172.20.30.158
> export MESOS_NATIVE_JAVA_LIBRARY=/usr/local/lib/libmesos.so
> {code}
> When running pyspark, the shell correctly initializes my SparkContext. However, when I run:
> {code}
> from pyspark import SparkContext, SparkConf
>
> # Stop the context the pyspark shell created, then rebuild it in the REPL
> sc.stop()
> conf = (
>     SparkConf()
>         .setMaster("mesos://zk://foo:2181/mesos")
>         .setAppName("Jupyter PySpark")
> )
> sc = SparkContext(conf=conf)
> {code}
> my _spark.driver.uri_ and URL classpath point to localhost, preventing my Mesos cluster from accessing the appropriate files.
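For completeness, a quick way to inspect the values a live context actually resolved (a sketch assuming a running {{sc}}; {{spark.driver.uri}} is the Mesos-side key mentioned in the report and may be absent in other modes):

{code}
# Print the driver-related settings the running SparkContext resolved.
driver_conf = sc.getConf()
for key in ("spark.driver.host", "spark.driver.uri", "spark.master"):
    if driver_conf.contains(key):
        print("%s = %s" % (key, driver_conf.get(key)))
{code}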


