Posted to issues@spark.apache.org by "Patrick Wendell (JIRA)" <ji...@apache.org> on 2014/11/25 21:42:12 UTC

[jira] [Commented] (SPARK-4605) Proposed Contribution: Spark Kernel to enable interactive Spark applications

    [ https://issues.apache.org/jira/browse/SPARK-4605?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14225173#comment-14225173 ] 

Patrick Wendell commented on SPARK-4605:
----------------------------------------

Thanks for sharing this design doc. The idea of a shared execution context like this has come up repeatedly over the course of the project, and there are various community projects, such as the JobServer, that provide pieces of this. As you move towards more concrete designs here, it would be helpful if you could focus on what changes would be needed in Spark core to support the proposed design. Can this be deployed as a Spark application today, or are there gaps? We want Spark's core API to be rich enough that projects like this can exist outside of Spark and develop community momentum on their own. This is increasingly the pattern we've used for major augmentations of Spark's functionality, and I think it has worked well so far (e.g. with SparkR and other projects).
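To make the "deployed as a Spark application today" question concrete, here is a minimal, hypothetical sketch of a long-lived driver that keeps a shared SparkContext open and serves requests without any changes to Spark core; the object name, port, and one-command request format are illustrative and not taken from the proposal or the attached design docs.

import java.io.{BufferedReader, InputStreamReader, PrintWriter}
import java.net.ServerSocket

import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical long-lived driver: a plain Spark application that keeps one
// SparkContext open and answers requests over a socket.
object SharedContextServer {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("shared-context-sketch"))
    val server = new ServerSocket(9999)
    try {
      while (true) {
        val client = server.accept()
        val in  = new BufferedReader(new InputStreamReader(client.getInputStream))
        val out = new PrintWriter(client.getOutputStream, true)
        // A real kernel would evaluate arbitrary Scala here; this sketch only
        // recognizes a single canned command.
        val reply = in.readLine() match {
          case "count" => sc.parallelize(1 to 1000).count().toString
          case other   => s"unrecognized request: $other"
        }
        out.println(reply)
        client.close()
      }
    } finally {
      server.close()
      sc.stop()
    }
  }
}

Submitted with spark-submit, something along these lines runs against any master without touching Spark internals; the open question in the comment above is whether the richer kernel features (streaming results, the IPython protocol) fit the same pattern.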

> Proposed Contribution: Spark Kernel to enable interactive Spark applications
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-4605
>                 URL: https://issues.apache.org/jira/browse/SPARK-4605
>             Project: Spark
>          Issue Type: New Feature
>            Reporter: Chip Senkbeil
>         Attachments: Kernel Architecture Widescreen.pdf, Kernel Architecture.pdf
>
>
> This architecture describes running code that was demonstrated at StrataConf in Barcelona, Spain.
> ----
> Enables applications to interact with a Spark cluster using Scala in several ways:
> * Defining and running core Spark Tasks
> * Collecting results from a cluster without needing to write to an external data store
> ** Ability to stream results using a well-defined protocol
> * Arbitrary Scala code definition and execution (without submitting heavyweight jars)
> Applications can be hosted and managed separately from the Spark cluster, using the kernel as a proxy to communicate requests.
> The Spark Kernel implements the server side of the IPython kernel protocol, the emerging de facto protocol for interactive language execution (Python, Haskell, etc.).
> It inherits a suite of industry-adopted clients such as the IPython Notebook.
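As a rough, hypothetical sketch of the "arbitrary Scala code execution against a shared SparkContext" idea described above (not taken from the attached design docs), the following binds a SparkContext into an embedded Scala interpreter (IMain); it assumes a Scala 2.10/2.11 scala-compiler artifact on the classpath, and all names are illustrative.

import org.apache.spark.{SparkConf, SparkContext}

import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.IMain

// Illustrative only: bind a shared SparkContext into an embedded Scala
// interpreter and evaluate a snippet that a notebook client might send,
// keeping the result on the driver rather than in an external data store.
object EvalSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("kernel-eval-sketch").setMaster("local[*]"))

    val settings = new Settings
    settings.usejavacp.value = true    // reuse the JVM classpath so Spark classes resolve
    val interpreter = new IMain(settings)

    // Make the shared context visible to interpreted code as "sc".
    interpreter.bind("sc", "org.apache.spark.SparkContext", sc)

    // A code string submitted by a client; nothing is written to external storage.
    interpreter.interpret("println(sc.parallelize(1 to 100).count())")

    interpreter.close()
    sc.stop()
  }
}

The local master keeps the sketch self-contained; on a real cluster, REPL-generated classes would also need to be shipped to executors, which is one of the gaps a kernel implementation has to cover.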



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
