Posted to issues@spark.apache.org by "Ryan D Braley (JIRA)" <ji...@apache.org> on 2014/09/12 11:48:33 UTC

[jira] [Commented] (SPARK-2593) Add ability to pass an existing Akka ActorSystem into Spark

    [ https://issues.apache.org/jira/browse/SPARK-2593?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14131333#comment-14131333 ] 

Ryan D Braley commented on SPARK-2593:
--------------------------------------

This would be quite useful. It is hard to use actorStream with Spark Streaming when remote actors send data to Spark, because we end up running two actor systems. Right now the name of the actor system in Spark appears to be hardcoded to "spark", and for actors to join an Akka cluster the actor systems need to share the same name. Without this change it is therefore difficult to distribute work from an external actor system to the Spark cluster.
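For context, here is a rough sketch of the actorStream setup I mean. The class and stream names are illustrative, and the remote actor path in the comment is an assumption that follows from the hardcoded "spark" system name:

import akka.actor.{Actor, Props}
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.receiver.ActorHelper

// Receiver actor that runs inside Spark's own actor system (the one named "spark").
class LineReceiver extends Actor with ActorHelper {
  def receive = {
    case line: String => store(line)  // hand each message over to Spark Streaming
  }
}

object ActorStreamSketch {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("actor-stream-sketch")
    val ssc  = new StreamingContext(conf, Seconds(2))

    // The receiver is created under Spark's internal actor system, so a remote
    // sender in our own ActorSystem has to address it with a path rooted at
    // "spark", e.g. akka.tcp://spark@<worker-host>:<port>/user/.../lines,
    // rather than joining the actor system our application already runs.
    val lines = ssc.actorStream[String](Props[LineReceiver], "lines")
    lines.print()

    ssc.start()
    ssc.awaitTermination()
  }
}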

> Add ability to pass an existing Akka ActorSystem into Spark
> -----------------------------------------------------------
>
>                 Key: SPARK-2593
>                 URL: https://issues.apache.org/jira/browse/SPARK-2593
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Helena Edelson
>
> As a developer I want to pass an existing ActorSystem into StreamingContext at load time, so that I do not have two actor systems running on a node in an Akka application.
> This would mean putting Spark's actor system on its own named dispatchers, as well as exposing the currently private creation of its actor system.
>  
> I would like to create an Akka Extension that wraps Spark/Spark Streaming and Cassandra. For a user, the programmatic creation would then simply be (a rough sketch of such an extension follows after this description):
> val extension = SparkCassandra(system)
>  
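Below is a rough, hypothetical sketch of what such a SparkCassandra Akka Extension could look like if Spark allowed reusing an existing ActorSystem. The names are illustrative and the Cassandra wiring is omitted; today Spark still creates its own "spark" system, so this only shows the intended shape of the API:

import akka.actor.{ExtendedActorSystem, Extension, ExtensionId, ExtensionIdProvider}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.streaming.{Seconds, StreamingContext}

// Hypothetical extension: Akka creates one instance per ActorSystem, lazily on first use.
class SparkCassandraExt(system: ExtendedActorSystem) extends Extension {
  // In the proposed design Spark would reuse `system` here instead of
  // spinning up its own "spark" ActorSystem; Cassandra configuration is omitted.
  private val conf = new SparkConf().setAppName(system.name)
  lazy val sc: SparkContext = new SparkContext(conf)
  lazy val ssc: StreamingContext = new StreamingContext(sc, Seconds(1))
}

object SparkCassandra extends ExtensionId[SparkCassandraExt] with ExtensionIdProvider {
  override def lookup = SparkCassandra
  override def createExtension(system: ExtendedActorSystem) = new SparkCassandraExt(system)
}

// Usage, exactly as in the description above:
//   val extension = SparkCassandra(system)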



