Posted to user@spark.apache.org by sparkdi <sh...@dubna.us> on 2015/03/30 20:26:22 UTC

Re: Actor not found

I have the same problem, i.e. an exception with the same call stack when I start
either pyspark or spark-shell. I use spark-1.3.0-bin-hadoop2.4 on Ubuntu
14.10.
bin/pyspark

A bunch of INFO messages appear, then an ActorInitializationException.
The shell starts, and I can do this:
>>> rd = sc.parallelize([1,2])
>>> rd.first()
This call does not return.
Also, if I start the master and then try to connect the shell to it, the shell
fails to connect, complaining about the master URL.
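
For reference, a standalone master is normally reached by passing its URL
explicitly when launching the shell (<master-host> below is a placeholder; 7077
is the standalone master's default port):

bin/spark-shell --master spark://<master-host>:7077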

The same tarball works fine on Windows.

Maybe some Linux versions are not supported?
Thank you
Dima



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Actor-not-found-tp22265p22300.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Actor not found

Posted by Shixiong Zhu <zs...@gmail.com>.
Thanks for the log. It's really helpful. I created a JIRA issue to explain why
this happens: https://issues.apache.org/jira/browse/SPARK-6640
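
For anyone curious about the underlying failure mode, here is a minimal,
self-contained Akka sketch (not Spark's code; the actor name is only borrowed
from the log above) showing that resolving an actor path before the actor has
been created fails with ActorNotFound, while the same lookup succeeds once the
actor exists:

import akka.actor.{Actor, ActorSystem, Props}
import akka.util.Timeout
import scala.concurrent.Await
import scala.concurrent.duration._
import scala.util.Try

object ResolveRaceSketch extends App {
  val system = ActorSystem("sketch")
  implicit val timeout: Timeout = Timeout(2.seconds)

  // No actor named HeartbeatReceiver exists yet, so the lookup fails with
  // akka.actor.ActorNotFound -- the same failure seen in the spark-shell log.
  val before = Try(Await.result(
    system.actorSelection("/user/HeartbeatReceiver").resolveOne(), 2.seconds))
  println(s"before creation: $before")

  // Once the actor has been created, the identical selection resolves normally.
  system.actorOf(Props(new Actor { def receive = { case m => sender() ! m } }),
    "HeartbeatReceiver")
  val after = Try(Await.result(
    system.actorSelection("/user/HeartbeatReceiver").resolveOne(), 2.seconds))
  println(s"after creation: $after")

  system.shutdown()  // Akka 2.3.x API, the version bundled with Spark 1.3
}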

However, does this error always happen in your environment?

Best Regards,
Shixiong Zhu

2015-03-31 22:36 GMT+08:00 sparkdi <sh...@dubna.us>:

> This is the whole output from the shell:
>
> ~/spark-1.3.0-bin-hadoop2.4$ sudo bin/spark-shell
> Spark assembly has been built with Hive, including Datanucleus jars on
> classpath
> log4j:WARN No appenders could be found for logger
> (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
> log4j:WARN Please initialize the log4j system properly.
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
> more info.
> Using Spark's default log4j profile:
> org/apache/spark/log4j-defaults.properties
> 15/03/30 19:00:40 INFO SecurityManager: Changing view acls to: root
> 15/03/30 19:00:40 INFO SecurityManager: Changing modify acls to: root
> 15/03/30 19:00:40 INFO SecurityManager: SecurityManager: authentication
> disabled; ui acls disabled; users with view permissions: Set(root); users
> with modify permissions: Set(root)
> 15/03/30 19:00:40 INFO HttpServer: Starting HTTP Server
> 15/03/30 19:00:40 INFO Server: jetty-8.y.z-SNAPSHOT
> 15/03/30 19:00:40 INFO AbstractConnector: Started
> SocketConnector@0.0.0.0:47797
> 15/03/30 19:00:40 INFO Utils: Successfully started service 'HTTP class
> server' on port 47797.
> Welcome to
>       ____              __
>      / __/__  ___ _____/ /__
>     _\ \/ _ \/ _ `/ __/  '_/
>    /___/ .__/\_,_/_/ /_/\_\   version 1.3.0
>       /_/
>
> Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_75)
> Type in expressions to have them evaluated.
> Type :help for more information.
> 15/03/30 19:00:42 INFO SparkContext: Running Spark version 1.3.0
> 15/03/30 19:00:42 INFO SecurityManager: Changing view acls to: root
> 15/03/30 19:00:42 INFO SecurityManager: Changing modify acls to: root
> 15/03/30 19:00:42 INFO SecurityManager: SecurityManager: authentication
> disabled; ui acls disabled; users with view permissions: Set(root); users
> with modify permissions: Set(root)
> 15/03/30 19:00:42 INFO Slf4jLogger: Slf4jLogger started
> 15/03/30 19:00:42 INFO Remoting: Starting remoting
> 15/03/30 19:00:43 INFO Remoting: Remoting started; listening on addresses
> :[akka.tcp://sparkDriver@vm:52574]
> 15/03/30 19:00:43 INFO Utils: Successfully started service 'sparkDriver' on
> port 52574.
> 15/03/30 19:00:43 INFO SparkEnv: Registering MapOutputTracker
> 15/03/30 19:00:43 INFO SparkEnv: Registering BlockManagerMaster
> 15/03/30 19:00:43 INFO DiskBlockManager: Created local directory at
> /tmp/spark-f71a8d86-6e49-4dfe-bb98-8e8581015acc/blockmgr-57532f5a-38db-4ba3-86d8-edef84f592e5
> 15/03/30 19:00:43 INFO MemoryStore: MemoryStore started with capacity 265.4
> MB
> 15/03/30 19:00:43 INFO HttpFileServer: HTTP File server directory is
> /tmp/spark-95e0a143-0de3-4c96-861c-968c9fae2746/httpd-cb029cd6-4943-479d-9b56-e7397489d9ea
> 15/03/30 19:00:43 INFO HttpServer: Starting HTTP Server
> 15/03/30 19:00:43 INFO Server: jetty-8.y.z-SNAPSHOT
> 15/03/30 19:00:43 INFO AbstractConnector: Started
> SocketConnector@0.0.0.0:48500
> 15/03/30 19:00:43 INFO Utils: Successfully started service 'HTTP file
> server' on port 48500.
> 15/03/30 19:00:43 INFO SparkEnv: Registering OutputCommitCoordinator
> 15/03/30 19:00:43 INFO Server: jetty-8.y.z-SNAPSHOT
> 15/03/30 19:00:43 INFO AbstractConnector: Started
> SelectChannelConnector@0.0.0.0:4040
> 15/03/30 19:00:43 INFO Utils: Successfully started service 'SparkUI' on
> port
> 4040.
> 15/03/30 19:00:43 INFO SparkUI: Started SparkUI at http://vm:4040
> 15/03/30 19:00:43 INFO Executor: Starting executor ID <driver> on host
> localhost
> 15/03/30 19:00:43 INFO Executor: Using REPL class URI:
> http://10.11.204.80:47797
> 15/03/30 19:00:43 INFO AkkaUtils: Connecting to HeartbeatReceiver:
> akka.tcp://sparkDriver@vm:52574/user/HeartbeatReceiver
> 15/03/30 19:00:43 ERROR OneForOneStrategy: Actor not found for:
> ActorSelection[Anchor(akka://sparkDriver/deadLetters), Path(/)]
> akka.actor.ActorInitializationException: exception during creation
>         at akka.actor.ActorInitializationException$.apply(Actor.scala:164)
>         at akka.actor.ActorCell.create(ActorCell.scala:596)
>         at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:456)
>         at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
>         at
> akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
>         at akka.dispatch.Mailbox.run(Mailbox.scala:219)
>         at
>
> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
>         at
> scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>         at
>
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>         at
> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>         at
>
> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
> Caused by: akka.actor.ActorNotFound: Actor not found for:
> ActorSelection[Anchor(akka://sparkDriver/deadLetters), Path(/)]
>         at
>
> akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:65)
>         at
>
> akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:63)
>         at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
>         at
>
> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
>         at
>
> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
>         at
>
> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
>         at
>
> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
>         at
> scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
>         at
> akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
>         at
>
> akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
>         at
> akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:110)
>         at
>
> akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
>         at
> scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
>         at
>
> scala.concurrent.impl.Promise$DefaultPromise.scala$concurrent$impl$Promise$DefaultPromise$$dispatchOrAddCallback(Promise.scala:280)
>         at
> scala.concurrent.impl.Promise$DefaultPromise.onComplete(Promise.scala:270)
>         at akka.actor.ActorSelection.resolveOne(ActorSelection.scala:63)
>         at akka.actor.ActorSelection.resolveOne(ActorSelection.scala:80)
>         at
> org.apache.spark.util.AkkaUtils$.makeDriverRef(AkkaUtils.scala:221)
>         at
>
> org.apache.spark.executor.Executor.startDriverHeartbeater(Executor.scala:393)
>         at org.apache.spark.executor.Executor.<init>(Executor.scala:119)
>         at
> org.apache.spark.scheduler.local.LocalActor.<init>(LocalBackend.scala:58)
>         at
>
> org.apache.spark.scheduler.local.LocalBackend$$anonfun$start$1.apply(LocalBackend.scala:107)
>         at
>
> org.apache.spark.scheduler.local.LocalBackend$$anonfun$start$1.apply(LocalBackend.scala:107)
>         at akka.actor.TypedCreatorFunctionConsumer.produce(Props.scala:343)
>         at akka.actor.Props.newActor(Props.scala:252)
>         at akka.actor.ActorCell.newActor(ActorCell.scala:552)
>         at akka.actor.ActorCell.create(ActorCell.scala:578)
>         ... 9 more
> 15/03/30 19:00:43 INFO NettyBlockTransferService: Server created on 58205
> 15/03/30 19:00:43 INFO BlockManagerMaster: Trying to register BlockManager
> 15/03/30 19:00:43 INFO BlockManagerMasterActor: Registering block manager
> localhost:58205 with 265.4 MB RAM, BlockManagerId(<driver>, localhost, 58205)
> 15/03/30 19:00:43 INFO BlockManagerMaster: Registered BlockManager
> 15/03/30 19:00:43 INFO SparkILoop: Created spark context..
> Spark context available as sc.
> 15/03/30 19:00:43 INFO SparkILoop: Created sql context (with Hive
> support)..
> SQL context available as sqlContext.
>
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Actor-not-found-tp22265p22324.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>

Re: Actor not found

Posted by sparkdi <sh...@dubna.us>.
This is the whole output from the shell:

~/spark-1.3.0-bin-hadoop2.4$ sudo bin/spark-shell
Spark assembly has been built with Hive, including Datanucleus jars on
classpath
log4j:WARN No appenders could be found for logger
(org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for
more info.
Using Spark's default log4j profile:
org/apache/spark/log4j-defaults.properties
15/03/30 19:00:40 INFO SecurityManager: Changing view acls to: root
15/03/30 19:00:40 INFO SecurityManager: Changing modify acls to: root
15/03/30 19:00:40 INFO SecurityManager: SecurityManager: authentication
disabled; ui acls disabled; users with view permissions: Set(root); users with
modify permissions: Set(root)
15/03/30 19:00:40 INFO HttpServer: Starting HTTP Server
15/03/30 19:00:40 INFO Server: jetty-8.y.z-SNAPSHOT
15/03/30 19:00:40 INFO AbstractConnector: Started
SocketConnector@0.0.0.0:47797
15/03/30 19:00:40 INFO Utils: Successfully started service 'HTTP class
server' on port 47797.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.3.0
      /_/

Using Scala version 2.10.4 (OpenJDK 64-Bit Server VM, Java 1.7.0_75)
Type in expressions to have them evaluated.
Type :help for more information.
15/03/30 19:00:42 INFO SparkContext: Running Spark version 1.3.0
15/03/30 19:00:42 INFO SecurityManager: Changing view acls to: root
15/03/30 19:00:42 INFO SecurityManager: Changing modify acls to: root
15/03/30 19:00:42 INFO SecurityManager: SecurityManager: authentication
disabled; ui acls disabled; users with view permissions: Set(root); users with
modify permissions: Set(root)
15/03/30 19:00:42 INFO Slf4jLogger: Slf4jLogger started
15/03/30 19:00:42 INFO Remoting: Starting remoting
15/03/30 19:00:43 INFO Remoting: Remoting started; listening on addresses
:[akka.tcp://sparkDriver@vm:52574]
15/03/30 19:00:43 INFO Utils: Successfully started service 'sparkDriver' on
port 52574.
15/03/30 19:00:43 INFO SparkEnv: Registering MapOutputTracker
15/03/30 19:00:43 INFO SparkEnv: Registering BlockManagerMaster
15/03/30 19:00:43 INFO DiskBlockManager: Created local directory at
/tmp/spark-f71a8d86-6e49-4dfe-bb98-8e8581015acc/blockmgr-57532f5a-38db-4ba3-86d8-edef84f592e5
15/03/30 19:00:43 INFO MemoryStore: MemoryStore started with capacity 265.4
MB
15/03/30 19:00:43 INFO HttpFileServer: HTTP File server directory is
/tmp/spark-95e0a143-0de3-4c96-861c-968c9fae2746/httpd-cb029cd6-4943-479d-9b56-e7397489d9ea
15/03/30 19:00:43 INFO HttpServer: Starting HTTP Server
15/03/30 19:00:43 INFO Server: jetty-8.y.z-SNAPSHOT
15/03/30 19:00:43 INFO AbstractConnector: Started
SocketConnector@0.0.0.0:48500
15/03/30 19:00:43 INFO Utils: Successfully started service 'HTTP file
server' on port 48500.
15/03/30 19:00:43 INFO SparkEnv: Registering OutputCommitCoordinator
15/03/30 19:00:43 INFO Server: jetty-8.y.z-SNAPSHOT
15/03/30 19:00:43 INFO AbstractConnector: Started
SelectChannelConnector@0.0.0.0:4040
15/03/30 19:00:43 INFO Utils: Successfully started service 'SparkUI' on port
4040.
15/03/30 19:00:43 INFO SparkUI: Started SparkUI at http://vm:4040
15/03/30 19:00:43 INFO Executor: Starting executor ID <driver> on host
localhost
15/03/30 19:00:43 INFO Executor: Using REPL class URI:
http://10.11.204.80:47797
15/03/30 19:00:43 INFO AkkaUtils: Connecting to HeartbeatReceiver:
akka.tcp://sparkDriver@vm:52574/user/HeartbeatReceiver
15/03/30 19:00:43 ERROR OneForOneStrategy: Actor not found for:
ActorSelection[Anchor(akka://sparkDriver/deadLetters), Path(/)]
akka.actor.ActorInitializationException: exception during creation
        at akka.actor.ActorInitializationException$.apply(Actor.scala:164)
        at akka.actor.ActorCell.create(ActorCell.scala:596)
        at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:456)
        at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
        at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
        at akka.dispatch.Mailbox.run(Mailbox.scala:219)
        at
akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
        at
scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
        at
scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
        at
scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
        at
scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: akka.actor.ActorNotFound: Actor not found for:
ActorSelection[Anchor(akka://sparkDriver/deadLetters), Path(/)]
        at
akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:65)
        at
akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:63)
        at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
        at
akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
        at
akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
        at
akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
        at
akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
        at
scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
        at
akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
        at
akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
        at
akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:110)
        at
akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
        at
scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
        at
scala.concurrent.impl.Promise$DefaultPromise.scala$concurrent$impl$Promise$DefaultPromise$$dispatchOrAddCallback(Promise.scala:280)
        at
scala.concurrent.impl.Promise$DefaultPromise.onComplete(Promise.scala:270)
        at akka.actor.ActorSelection.resolveOne(ActorSelection.scala:63)
        at akka.actor.ActorSelection.resolveOne(ActorSelection.scala:80)
        at
org.apache.spark.util.AkkaUtils$.makeDriverRef(AkkaUtils.scala:221)
        at
org.apache.spark.executor.Executor.startDriverHeartbeater(Executor.scala:393)
        at org.apache.spark.executor.Executor.<init>(Executor.scala:119)
        at
org.apache.spark.scheduler.local.LocalActor.<init>(LocalBackend.scala:58)
        at
org.apache.spark.scheduler.local.LocalBackend$$anonfun$start$1.apply(LocalBackend.scala:107)
        at
org.apache.spark.scheduler.local.LocalBackend$$anonfun$start$1.apply(LocalBackend.scala:107)
        at akka.actor.TypedCreatorFunctionConsumer.produce(Props.scala:343)
        at akka.actor.Props.newActor(Props.scala:252)
        at akka.actor.ActorCell.newActor(ActorCell.scala:552)
        at akka.actor.ActorCell.create(ActorCell.scala:578)
        ... 9 more
15/03/30 19:00:43 INFO NettyBlockTransferService: Server created on 58205
15/03/30 19:00:43 INFO BlockManagerMaster: Trying to register BlockManager
15/03/30 19:00:43 INFO BlockManagerMasterActor: Registering block manager
localhost:58205 with 265.4 MB RAM, BlockManagerId(<driver>, localhost, 58205)
15/03/30 19:00:43 INFO BlockManagerMaster: Registered BlockManager
15/03/30 19:00:43 INFO SparkILoop: Created spark context..
Spark context available as sc.
15/03/30 19:00:43 INFO SparkILoop: Created sql context (with Hive support)..
SQL context available as sqlContext.




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Actor-not-found-tp22265p22324.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: Actor not found

Posted by Shixiong Zhu <zs...@gmail.com>.
Could you paste the whole stack trace here?

Best Regards,
Shixiong Zhu

2015-03-31 2:26 GMT+08:00 sparkdi <sh...@dubna.us>:

> I have the same problem, i.e. an exception with the same call stack when I
> start
> either pyspark or spark-shell. I use spark-1.3.0-bin-hadoop2.4 on Ubuntu
> 14.10.
> bin/pyspark
>
> A bunch of INFO messages appear, then an ActorInitializationException.
> The shell starts, and I can do this:
> >>> rd = sc.parallelize([1,2])
> >>> rd.first()
> This call does not return.
> Also, if I start the master and then try to connect the shell to it, the shell
> fails to connect, complaining about the master URL.
>
> The same tarball works fine on Windows.
>
> Maybe some Linux versions are not supported?
> Thank you
> Dima
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Actor-not-found-tp22265p22300.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>

Re: Actor not found

Posted by Zhihang Fan <ca...@gmail.com>.
Hi, Shixiong:
     Actually, I don't know much about this exception. I submitted a job that
would read about 2.5 TB of data and it threw this exception. I also tried to
resubmit some jobs that had run successfully before this submission, and they
failed with the same exception.
Hope this helps with the troubleshooting.
PS: Spark 1.3.0, Hadoop version 2.3.0-cdh5.1.0.

All my best,
Zhihang Fan

2015-04-17 16:59 GMT+08:00 Shixiong Zhu <zs...@gmail.com>:

> Forgot this one: I cannot find any issue with creating
> OutputCommitCoordinator. The order of creating OutputCommitCoordinator looks
> right.
>
> Best Regards,
> Shixiong(Ryan) Zhu
>
> 2015-04-17 16:57 GMT+08:00 Shixiong Zhu <zs...@gmail.com>:
>
>> I just checked the code for creating OutputCommitCoordinator. Could
>> you reproduce this issue? If so, could you provide details about how to
>> reproduce it?
>>
>> Best Regards,
>> Shixiong(Ryan) Zhu
>>
>> 2015-04-16 13:27 GMT+08:00 Canoe <ca...@gmail.com>:
>>
>>> 13119 Exception in thread "main" akka.actor.ActorNotFound: Actor not
>>> found
>>> for: ActorSelection[Anchor(akka.tcp://sparkDriver@dmslave13.et2.tbsite.net:5908/), Path(/user/OutputCommitCoordinator)]
>>> 13120         at
>>>
>>> akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:65)
>>> 13121         at
>>>
>>> akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:63)
>>> 13122         at
>>> scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
>>> 13123         at
>>>
>>> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
>>> 13124         at
>>>
>>> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
>>> 13125         at
>>>
>>> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
>>> 13126         at
>>>
>>> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
>>> 13127         at
>>> scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
>>> 13128         at
>>> akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
>>> 13129         at
>>>
>>> akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
>>> 13130         at
>>> akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:110)
>>> 13131         at
>>>
>>> akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
>>> 13132         at
>>> scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
>>> 13133         at
>>>
>>> scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
>>> 13134         at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:267)
>>> 13135         at
>>> akka.actor.EmptyLocalActorRef.specialHandle(ActorRef.scala:508)
>>> 13136         at
>>> akka.actor.DeadLetterActorRef.specialHandle(ActorRef.scala:541)
>>> 13137         at akka.actor.DeadLetterActorRef.$bang(ActorRef.scala:531)
>>> 13138         at
>>>
>>> akka.remote.RemoteActorRefProvider$RemoteDeadLetterActorRef.$bang(RemoteActorRefProvider.scala:87)
>>> 13139         at
>>> akka.remote.EndpointManager$$anonfun$1.applyOrElse(Remoting.scala:575)
>>> 13140         at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
>>> 13141         at
>>> akka.remote.EndpointManager.aroundReceive(Remoting.scala:395)
>>> 13142         at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
>>> 13143         at akka.actor.ActorCell.invoke(ActorCell.scala:487)
>>> 13144         at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
>>> 13145         at akka.dispatch.Mailbox.run(Mailbox.scala:220)
>>> 13146         at
>>>
>>> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
>>> 13147         at
>>> scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>>> 13148         at
>>>
>>> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>>> 13149         at
>>> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>>> 13150         at
>>>
>>> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>>>
>>>
>>> I met the same problem when I run Spark on YARN. Is this a bug or something else?
>>>
>>>
>>>
>>>
>>> --
>>> View this message in context:
>>> http://apache-spark-user-list.1001560.n3.nabble.com/Actor-not-found-tp22265p22508.html
>>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>>
>>> ---------------------------------------------------------------------
>>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>>> For additional commands, e-mail: user-help@spark.apache.org
>>>
>>>
>>
>


-- 
谁谓河广,一苇航之

Re: Actor not found

Posted by Shixiong Zhu <zs...@gmail.com>.
Forgot this one: I cannot find any issue with creating
OutputCommitCoordinator. The order of creating OutputCommitCoordinator looks
right.

Best Regards,
Shixiong(Ryan) Zhu

2015-04-17 16:57 GMT+08:00 Shixiong Zhu <zs...@gmail.com>:

> I just checked the code for creating OutputCommitCoordinator. Could you
> reproduce this issue? If so, could you provide details about how to
> reproduce it?
>
> Best Regards,
> Shixiong(Ryan) Zhu
>
> 2015-04-16 13:27 GMT+08:00 Canoe <ca...@gmail.com>:
>
>> 13119 Exception in thread "main" akka.actor.ActorNotFound: Actor not found
>> for: ActorSelection[Anchor(akka.tcp://sparkDriver@dmslave13.et2.tbsite.net:5908/), Path(/user/OutputCommitCoordinator)]
>> 13120         at
>>
>> akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:65)
>> 13121         at
>>
>> akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:63)
>> 13122         at
>> scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
>> 13123         at
>>
>> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
>> 13124         at
>>
>> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
>> 13125         at
>>
>> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
>> 13126         at
>>
>> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
>> 13127         at
>> scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
>> 13128         at
>> akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
>> 13129         at
>>
>> akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
>> 13130         at
>> akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:110)
>> 13131         at
>>
>> akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
>> 13132         at
>> scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
>> 13133         at
>>
>> scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
>> 13134         at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:267)
>> 13135         at
>> akka.actor.EmptyLocalActorRef.specialHandle(ActorRef.scala:508)
>> 13136         at
>> akka.actor.DeadLetterActorRef.specialHandle(ActorRef.scala:541)
>> 13137         at akka.actor.DeadLetterActorRef.$bang(ActorRef.scala:531)
>> 13138         at
>>
>> akka.remote.RemoteActorRefProvider$RemoteDeadLetterActorRef.$bang(RemoteActorRefProvider.scala:87)
>> 13139         at
>> akka.remote.EndpointManager$$anonfun$1.applyOrElse(Remoting.scala:575)
>> 13140         at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
>> 13141         at
>> akka.remote.EndpointManager.aroundReceive(Remoting.scala:395)
>> 13142         at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
>> 13143         at akka.actor.ActorCell.invoke(ActorCell.scala:487)
>> 13144         at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
>> 13145         at akka.dispatch.Mailbox.run(Mailbox.scala:220)
>> 13146         at
>>
>> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
>> 13147         at
>> scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>> 13148         at
>>
>> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>> 13149         at
>> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>> 13150         at
>>
>> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>>
>>
>> I met the same problem when I run Spark on YARN. Is this a bug or something else?
>>
>>
>>
>>
>> --
>> View this message in context:
>> http://apache-spark-user-list.1001560.n3.nabble.com/Actor-not-found-tp22265p22508.html
>> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>>
>> ---------------------------------------------------------------------
>> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
>> For additional commands, e-mail: user-help@spark.apache.org
>>
>>
>

Re: Actor not found

Posted by Shixiong Zhu <zs...@gmail.com>.
I just checked the code for creating OutputCommitCoordinator. Could you
reproduce this issue? If so, could you provide details about how to
reproduce it?

Best Regards,
Shixiong(Ryan) Zhu

2015-04-16 13:27 GMT+08:00 Canoe <ca...@gmail.com>:

> 13119 Exception in thread "main" akka.actor.ActorNotFound: Actor not found
> for: ActorSelection[Anchor(akka.tcp://sparkDriver@dmslave13.et2.tbsite.net:5908/), Path(/user/OutputCommitCoordinator)]
> 13120         at
>
> akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:65)
> 13121         at
>
> akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:63)
> 13122         at
> scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
> 13123         at
>
> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
> 13124         at
>
> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
> 13125         at
>
> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
> 13126         at
>
> akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
> 13127         at
> scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
> 13128         at
> akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
> 13129         at
>
> akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
> 13130         at
> akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:110)
> 13131         at
>
> akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
> 13132         at
> scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
> 13133         at
> scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
> 13134         at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:267)
> 13135         at
> akka.actor.EmptyLocalActorRef.specialHandle(ActorRef.scala:508)
> 13136         at
> akka.actor.DeadLetterActorRef.specialHandle(ActorRef.scala:541)
> 13137         at akka.actor.DeadLetterActorRef.$bang(ActorRef.scala:531)
> 13138         at
>
> akka.remote.RemoteActorRefProvider$RemoteDeadLetterActorRef.$bang(RemoteActorRefProvider.scala:87)
> 13139         at
> akka.remote.EndpointManager$$anonfun$1.applyOrElse(Remoting.scala:575)
> 13140         at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
> 13141         at
> akka.remote.EndpointManager.aroundReceive(Remoting.scala:395)
> 13142         at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
> 13143         at akka.actor.ActorCell.invoke(ActorCell.scala:487)
> 13144         at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
> 13145         at akka.dispatch.Mailbox.run(Mailbox.scala:220)
> 13146         at
>
> akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
> 13147         at
> scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
> 13148         at
>
> scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
> 13149         at
> scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
> 13150         at
>
> scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>
>
> I met the same problem when I run spark on yarn. Is this a bug or what ?
>
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Actor-not-found-tp22265p22508.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
> For additional commands, e-mail: user-help@spark.apache.org
>
>

Re: Actor not found

Posted by Canoe <ca...@gmail.com>.
13119 Exception in thread "main" akka.actor.ActorNotFound: Actor not found
for: ActorSelection[Anchor(akka.tcp://sparkDriver@dmslave13.et2.tbsite.net:5908/), Path(/user/OutputCommitCoordinator)]
13120         at
akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:65)
13121         at
akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:63)
13122         at
scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
13123         at
akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
13124         at
akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
13125         at
akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
13126         at
akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
13127         at
scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
13128         at
akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
13129         at
akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
13130         at
akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:110)
13131         at
akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
13132         at
scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
13133         at
scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
13134         at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:267)
13135         at
akka.actor.EmptyLocalActorRef.specialHandle(ActorRef.scala:508)
13136         at
akka.actor.DeadLetterActorRef.specialHandle(ActorRef.scala:541)
13137         at akka.actor.DeadLetterActorRef.$bang(ActorRef.scala:531)
13138         at
akka.remote.RemoteActorRefProvider$RemoteDeadLetterActorRef.$bang(RemoteActorRefProvider.scala:87)
13139         at
akka.remote.EndpointManager$$anonfun$1.applyOrElse(Remoting.scala:575)
13140         at akka.actor.Actor$class.aroundReceive(Actor.scala:465)
13141         at
akka.remote.EndpointManager.aroundReceive(Remoting.scala:395)
13142         at akka.actor.ActorCell.receiveMessage(ActorCell.scala:516)
13143         at akka.actor.ActorCell.invoke(ActorCell.scala:487)
13144         at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:238)
13145         at akka.dispatch.Mailbox.run(Mailbox.scala:220)
13146         at
akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
13147         at
scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
13148         at
scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
13149         at
scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
13150         at
scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)


I met the same problem when I run Spark on YARN. Is this a bug or something else?
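
For context, a typical Spark 1.3 submission to YARN looks roughly like this
(the class and jar names are placeholders, not the actual job):

bin/spark-submit --master yarn-cluster --class com.example.MyJob my-job.jar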




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Actor-not-found-tp22265p22508.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org