Posted to issues@spark.apache.org by "Josh Rosen (JIRA)" <ji...@apache.org> on 2015/09/17 03:07:46 UTC

[jira] [Commented] (SPARK-10635) pyspark - running on a different host

    [ https://issues.apache.org/jira/browse/SPARK-10635?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14791389#comment-14791389 ] 

Josh Rosen commented on SPARK-10635:
------------------------------------

[~davies], do you think we should support this? This seems like a hard-to-support feature, so I'm inclined to say that this issue is "Won't Fix" as currently described.

> pyspark - running on a different host
> -------------------------------------
>
>                 Key: SPARK-10635
>                 URL: https://issues.apache.org/jira/browse/SPARK-10635
>             Project: Spark
>          Issue Type: Improvement
>            Reporter: Ben Duffield
>
> At various points we assume we only ever talk to a driver on the same host.
> e.g. https://github.com/apache/spark/blob/v1.4.1/core/src/main/scala/org/apache/spark/api/python/PythonRDD.scala#L615
> We use pyspark to connect to an existing driver (i.e. we do not let pyspark launch the driver itself, but instead construct the SparkContext with the gateway and jsc arguments).
> There are a few reasons for this, but essentially it's to allow more flexibility when running in AWS.
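> For concreteness, a rough sketch of that attachment (the gateway address, port, and the entry-point method for obtaining the driver's JavaSparkContext are placeholders; how the driver publishes them is deployment-specific):
> {code}
> from py4j.java_gateway import GatewayClient, JavaGateway
> from pyspark import SparkConf, SparkContext
>
> # Connect to the py4j server of the already-running driver JVM (placeholder address/port).
> gateway = JavaGateway(GatewayClient(address="driver-host", port=25333), auto_convert=True)
>
> # Hypothetical entry point exposing the driver's JavaSparkContext.
> jsc = gateway.entry_point.getJavaSparkContext()
>
> # Reuse the driver's configuration and wrap everything in a Python SparkContext.
> conf = SparkConf(_jvm=gateway.jvm, _jconf=jsc.getConf())
> sc = SparkContext(gateway=gateway, jsc=jsc, conf=conf)
> {code}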
> Before 1.3.1 we were able to monkeypatch around this:  
> {code}
> import socket
>
> import pyspark.rdd
>
> # "host" is the address of the remote driver; the stock implementation
> # hard-codes "localhost" here. The value below is a placeholder.
> host = "driver-host"
>
> def _load_from_socket(port, serializer):
>     sock = socket.socket()
>     sock.settimeout(3)
>     try:
>         sock.connect((host, port))
>         rf = sock.makefile("rb", 65536)
>         for item in serializer.load_stream(rf):
>             yield item
>     finally:
>         sock.close()
>
> pyspark.rdd._load_from_socket = _load_from_socket
> {code}
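> For illustration, with the patch in place, calls that stream results back to Python (e.g. collect()) connect to (host, port) rather than ("localhost", port):
> {code}
> # Assumes sc is a SparkContext attached to the remote driver.
> rdd = sc.parallelize(range(100)).map(lambda x: x * x)
> print(rdd.collect())  # collect() streams results through the patched _load_from_socket
> {code}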



