Posted to issues@spark.apache.org by "Patrick Wendell (JIRA)" <ji...@apache.org> on 2014/10/12 19:18:33 UTC

[jira] [Commented] (SPARK-2593) Add ability to pass an existing Akka ActorSystem into Spark

    [ https://issues.apache.org/jira/browse/SPARK-2593?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14168727#comment-14168727 ] 

Patrick Wendell commented on SPARK-2593:
----------------------------------------

Yeah, for Spark Streaming the API visibility is not an issue, because we are explicitly exposing Akka as an API (and it's an add-on connector). I still don't quite understand whether Akka's model is that different applications are supposed to share actor systems. We rely heavily on customizing the Akka configuration, and if a user passes in an ActorSystem it will break assumptions inside of Spark - so I think allowing users to pass in an ActorSystem is going to be difficult. Exposing our ActorSystem in Spark Streaming seems more reasonable, though, since the configuration is immutable at that point. Other things, like providing better naming, make a lot of sense.

> Add ability to pass an existing Akka ActorSystem into Spark
> -----------------------------------------------------------
>
>                 Key: SPARK-2593
>                 URL: https://issues.apache.org/jira/browse/SPARK-2593
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Helena Edelson
>
> As a developer I want to pass an existing ActorSystem into the StreamingContext at load time, so that I do not have 2 actor systems running on a node in an Akka application.
> This would mean putting Spark's actors on their own named dispatchers, as well as exposing the currently private creation of its actor system.
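The "named dispatchers" idea in the description corresponds to Akka's dispatcher configuration. A minimal sketch of what isolating Spark's internal actors on their own dispatcher could look like in the application's Typesafe Config (the "spark-dispatcher" name and the tuning values are hypothetical, not anything Spark actually defines):

    # application.conf (HOCON)
    # Hypothetical dedicated dispatcher so Spark's actors would not
    # contend with the host application's own Akka dispatchers.
    spark-dispatcher {
      type = Dispatcher
      executor = "fork-join-executor"
      fork-join-executor {
        parallelism-min = 2
        parallelism-max = 8
      }
      throughput = 100
    }

Actors could then be assigned to it at creation time via Props.withDispatcher("spark-dispatcher"), which is the standard Akka mechanism for pinning actors to a named dispatcher.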



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org