Posted to issues@spark.apache.org by "Shiti Saxena (JIRA)" <ji...@apache.org> on 2014/11/05 09:35:33 UTC
[jira] [Updated] (SPARK-4171) StreamingContext.actorStream throws serializationError
[ https://issues.apache.org/jira/browse/SPARK-4171?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Shiti Saxena updated SPARK-4171:
--------------------------------
Component/s: Streaming
> StreamingContext.actorStream throws serializationError
> ------------------------------------------------------
>
> Key: SPARK-4171
> URL: https://issues.apache.org/jira/browse/SPARK-4171
> Project: Spark
> Issue Type: Bug
> Components: Streaming
> Affects Versions: 1.1.0, 1.2.0
> Reporter: Shiti Saxena
>
> I encountered this issue while working on https://issues.apache.org/jira/browse/SPARK-3872.
> Running the following test case on v1.1.0 and the master branch (v1.2.0-SNAPSHOT) throws a serialization error.
> {noformat}
> test("actor input stream") {
>   // Set up the streaming context and input streams
>   val ssc = new StreamingContext(conf, batchDuration)
>   val networkStream = ssc.actorStream[String](EchoActor.props, "TestActor",
>     // Had to pass the local value of port to prevent from closing over entire scope
>     StorageLevel.MEMORY_AND_DISK)
>   println("created actor")
>   networkStream.print()
>   ssc.start()
>   Thread.sleep(3 * 1000)
>   println("started stream")
>   Thread.sleep(3 * 1000)
>   logInfo("Stopping server")
>   logInfo("Stopping context")
>   ssc.stop()
> }
> {noformat}
> where EchoActor is defined as:
> {noformat}
> class EchoActor extends Actor with ActorHelper {
>   override def receive = {
>     case message => sender ! message
>   }
> }
>
> object EchoActor {
>   def props: Props = Props(new EchoActor())
> }
> {noformat}
> The same code works with v1.0.1.
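
[Editorial sketch, not part of the original report: the inline comment in the test ("prevent from closing over entire scope") hints at the usual cause of this class of serialization error — a closure or Props factory created inside a non-serializable enclosing class captures that class's `this`. The following minimal, Spark- and Akka-free Scala sketch reproduces that capture pitfall with plain JDK serialization; all names (`Enclosing`, `capturingFactory`, `CaptureDemo`) are hypothetical.]

```scala
import java.io.{ByteArrayOutputStream, NotSerializableException, ObjectOutputStream}

// Hypothetical stand-in for a non-serializable enclosing test class.
class Enclosing {
  val tag = "echo"
  // The closure reads the field `tag`, i.e. `this.tag`, so it captures
  // the whole (non-serializable) Enclosing instance.
  def capturingFactory: () => String = () => tag + "!"
}

object CaptureDemo {
  // Round-trip an object through JDK serialization; throws if anything
  // in its reachable graph is not Serializable.
  def serialize(obj: AnyRef): Unit = {
    val out = new ObjectOutputStream(new ByteArrayOutputStream())
    try out.writeObject(obj) finally out.close()
  }

  def main(args: Array[String]): Unit = {
    val factory = new Enclosing().capturingFactory
    val failed =
      try { serialize(factory); false }
      catch { case _: NotSerializableException => true }
    println(s"serialization failed: $failed")
  }
}
```

[A common fix for this shape of problem is to build the value from locals only (e.g. copy the needed field into a local `val` before constructing the closure), so the enclosing instance is never captured.]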
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org