Posted to issues@spark.apache.org by "Helena Edelson (JIRA)" <ji...@apache.org> on 2014/09/13 17:36:33 UTC
[jira] [Updated] (SPARK-2593) Add ability to pass an existing Akka ActorSystem into Spark
[ https://issues.apache.org/jira/browse/SPARK-2593?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Helena Edelson updated SPARK-2593:
----------------------------------
Description:
As a developer, I want to pass an existing ActorSystem into StreamingContext at load time so that I do not have two actor systems running on a node in an Akka application.
This would mean running Spark's actor system on its own named dispatchers, as well as exposing the currently private creation of its own actor system.
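The API shape this ticket asks for could look like the following sketch. It is purely hypothetical: no such StreamingContext overload exists in Spark, and the ActorSystem, SparkConf, and StreamingContext classes below are minimal stand-ins (the real ones live in akka.actor and org.apache.spark) so the example is self-contained.

```scala
// Stand-ins for akka.actor.ActorSystem and Spark's own classes.
class ActorSystem(val name: String)
class SparkConf

// Hypothetical StreamingContext that reuses a caller-supplied ActorSystem
// instead of privately creating a second one on the same node.
class StreamingContext(conf: SparkConf, existingSystem: Option[ActorSystem]) {
  val actorSystem: ActorSystem =
    existingSystem.getOrElse(new ActorSystem("sparkDriver")) // Spark-created fallback
}

val appSystem = new ActorSystem("my-akka-app")
val ssc = new StreamingContext(new SparkConf, Some(appSystem))
assert(ssc.actorSystem eq appSystem) // only one actor system on the node
```

With `None`, the sketch falls back to creating its own system, mirroring Spark's current behavior.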
was:
As a developer, I want to pass an existing ActorSystem into StreamingContext at load time so that I do not have two actor systems running on a node in an Akka application.
This would mean running Spark's actor system on its own named dispatchers, as well as exposing the currently private creation of its own actor system.
I would like to create an Akka Extension that wraps around Spark/Spark Streaming and Cassandra, so that programmatic creation for a user would simply be:
val extension = SparkCassandra(system)
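The Akka Extension pattern behind `SparkCassandra(system)` can be sketched as below. The `SparkCassandra` object and its extension class are hypothetical, and the ActorSystem/Extension types are stubs standing in for `akka.actor.ActorSystem`, `Extension`, and `ExtensionId`; the one Akka behavior the stubs reproduce is that extensions are cached per actor system, so repeated calls return the same instance.

```scala
// Stand-in for akka.actor.ActorSystem with Akka's per-system extension cache.
class ActorSystem(val name: String) {
  private val extensions = scala.collection.mutable.Map.empty[AnyRef, AnyRef]
  // Akka caches extensions per system, so SparkCassandra(system) is idempotent.
  def registerExtension[T <: AnyRef](id: AnyRef, create: => T): T =
    extensions.getOrElseUpdate(id, create).asInstanceOf[T]
}

trait Extension

// The extension instance: in the proposed design this would hold the
// StreamingContext built on top of the caller's ActorSystem.
class SparkCassandraExt(system: ActorSystem) extends Extension {
  val systemName: String = system.name // placeholder for the real wiring
}

// ExtensionId-style companion providing the `SparkCassandra(system)` syntax.
object SparkCassandra {
  def apply(system: ActorSystem): SparkCassandraExt =
    system.registerExtension(this, new SparkCassandraExt(system))
}
```

Calling `SparkCassandra(system)` twice on the same system would then hand back the same extension instance rather than spinning up a second one.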
> Add ability to pass an existing Akka ActorSystem into Spark
> -----------------------------------------------------------
>
> Key: SPARK-2593
> URL: https://issues.apache.org/jira/browse/SPARK-2593
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Reporter: Helena Edelson
>
> As a developer, I want to pass an existing ActorSystem into StreamingContext at load time so that I do not have two actor systems running on a node in an Akka application.
> This would mean running Spark's actor system on its own named dispatchers, as well as exposing the currently private creation of its own actor system.
>
>
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org