Posted to issues@spark.apache.org by "Marcelo Vanzin (JIRA)" <ji...@apache.org> on 2014/08/26 00:30:57 UTC

[jira] [Updated] (SPARK-3215) Add remote interface for SparkContext

     [ https://issues.apache.org/jira/browse/SPARK-3215?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Marcelo Vanzin updated SPARK-3215:
----------------------------------

    Attachment: RemoteSparkContext.pdf

Initial proposal for a remote context interface.

Note that this is not a formal design document, just a high-level proposal, so it doesn't go deeply into what APIs would be exposed or anything like that.
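
To make that a little more concrete, here is a rough, purely hypothetical sketch of the kind of client-facing surface such an interface could expose. None of these names (RemoteSparkContext, RemoteJob, JobContext) exist in Spark today; they are only meant to illustrate the shape of the API, in Scala:

    import scala.concurrent.Future

    // A job is ordinary user code that runs inside the remote context's JVM.
    trait RemoteJob[T] extends Serializable {
      def call(ctx: JobContext): T
    }

    // Wrapper handed to jobs on the remote side; owns the long-lived SparkContext.
    trait JobContext {
      def sc: org.apache.spark.SparkContext
    }

    // Client-side handle held by the application (e.g. Hive) for one user session.
    trait RemoteSparkContext {
      def submit[T](job: RemoteJob[T]): Future[T]  // ship the job over RPC, get the result back
      def stop(): Unit                             // shut down the remote process
    }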

> Add remote interface for SparkContext
> -------------------------------------
>
>                 Key: SPARK-3215
>                 URL: https://issues.apache.org/jira/browse/SPARK-3215
>             Project: Spark
>          Issue Type: New Feature
>          Components: Spark Core
>            Reporter: Marcelo Vanzin
>              Labels: hive
>         Attachments: RemoteSparkContext.pdf
>
>
> A quick description of the issue: as part of running Hive jobs on top of Spark, it's desirable to have a SparkContext that is running in the background and listening for job requests for a particular user session.
> Running multiple contexts in the same JVM is not a very good solution. Not only does SparkContext currently have issues sharing the same JVM among multiple instances, but that also turns the JVM running the contexts into a huge bottleneck in the system.
> So I'm proposing a solution where we have a SparkContext running in a separate process and listening for requests from the client application via some RPC interface (most probably Akka).
> I'll attach a document shortly with the current proposal. Let's use this bug to discuss the proposal and any other suggestions.
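
Just to illustrate what the RPC layer mentioned in the description might look like, here is a minimal Akka-flavored sketch. The message types and actor below are invented for this example and are not part of any existing Spark API; error handling, serialization and session management are all glossed over:

    import akka.actor.Actor

    // Messages exchanged between the client application and the remote context process.
    case class SubmitJob(jobId: String, serializedJob: Array[Byte])
    case class JobResult(jobId: String, result: Either[Throwable, Array[Byte]])
    case object StopContext

    // Server-side actor running in the remote process that owns the SparkContext.
    class ContextActor(runJob: Array[Byte] => Array[Byte]) extends Actor {
      def receive = {
        case SubmitJob(id, job) =>
          // Run the job against the local SparkContext and send the outcome back.
          val outcome =
            try Right(runJob(job))
            catch { case e: Throwable => Left(e) }
          sender() ! JobResult(id, outcome)
        case StopContext =>
          context.stop(self)
      }
    }

The client side would then hold a reference to one such actor per user session and map SubmitJob/JobResult pairs onto the Future returned by the submit() call sketched earlier.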



--
This message was sent by Atlassian JIRA
(v6.2#6252)
