Posted to user@spark.apache.org by Andrew Milkowski <am...@gmail.com> on 2014/07/28 20:42:35 UTC

akka.tcp://spark@localhost:7077/user/MapOutputTracker akka.actor.ActorNotFound

Hello community

Using the following distros:

spark: http://archive.cloudera.com/cdh5/cdh/5/spark-1.0.0-cdh5.1.0-src.tar.gz
mesos: http://archive.apache.org/dist/mesos/0.19.0/mesos-0.19.0.tar.gz

both assembled with Scala 2.10.4 and Java 7

My spark-env.sh looks as follows:

#!/usr/bin/env bash

export SCALA_HOME=/opt/local/src/scala/scala-2.10.4
export MESOS_NATIVE_LIBRARY=/opt/local/src/mesos/mesos-0.19.0/dist/lib/libmesos.so
export SPARK_EXECUTOR_URI=hdfs://localhost:8020/spark/spark-1.0.0-cdh5.1.0-bin-2.3.0-cdh5.0.3.tgz
export HADOOP_CONF_DIR=/opt/local/cloudera/hadoop/cdh5/hadoop-2.3.0-cdh5.0.3/etc/hadoop
export STANDALONE_SPARK_MASTER_HOST=192.168.122.1

export MASTER=mesos://192.168.122.1
export SPARK_MASTER_IP=192.168.122.1
export SPARK_LOCAL_IP=192.168.122.1

When I run a sample Spark job, I get the exception shown in the log below.

Thanks in advance for an explanation of, or a fix for, this exception.

Note: if I run the Spark job on Spark standalone by itself (or on Hadoop YARN),
the job runs without any problem.
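
For reference, a stripped-down sketch of the kind of job being submitted (the app
name, the RDD contents, and the :5050 default Mesos master port are placeholders,
not the actual job):

import org.apache.spark.{SparkConf, SparkContext}

object SampleJob {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("sample-job")
      // MASTER is exported in spark-env.sh above; :5050 is only the usual
      // Mesos master port, assumed here because the export omits it.
      .setMaster(sys.env.getOrElse("MASTER", "mesos://192.168.122.1:5050"))
      // Mirrors the SPARK_EXECUTOR_URI export so executors fetch the same tarball.
      .set("spark.executor.uri",
        "hdfs://localhost:8020/spark/spark-1.0.0-cdh5.1.0-bin-2.3.0-cdh5.0.3.tgz")
    val sc = new SparkContext(conf)

    // Trivial action, just enough to exercise the Mesos executors.
    val n = sc.parallelize(1 to 1000).map(_ * 2).count()
    println(s"count = $n")

    sc.stop()
  }
}

The executor log and stack trace from that run follow: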


WARNING: Logging before InitGoogleLogging() is written to STDERR
I0728 14:33:52.421203 19678 fetcher.cpp:73] Fetching URI 'hdfs://localhost:8020/spark/spark-1.0.0-cdh5.1.0-bin-2.3.0-cdh5.0.3.tgz'
I0728 14:33:52.421346 19678 fetcher.cpp:102] Downloading resource from 'hdfs://localhost:8020/spark/spark-1.0.0-cdh5.1.0-bin-2.3.0-cdh5.0.3.tgz' to '/tmp/mesos/slaves/20140724-134606-16777343-5050-25095-0/frameworks/20140728-143300-24815808-5050-19059-0000/executors/20140724-134606-16777343-5050-25095-0/runs/c9c9eaa2-b722-4215-a35a-dc1c353963b9/spark-1.0.0-cdh5.1.0-bin-2.3.0-cdh5.0.3.tgz'
I0728 14:33:58.201438 19678 fetcher.cpp:61] Extracted resource '/tmp/mesos/slaves/20140724-134606-16777343-5050-25095-0/frameworks/20140728-143300-24815808-5050-19059-0000/executors/20140724-134606-16777343-5050-25095-0/runs/c9c9eaa2-b722-4215-a35a-dc1c353963b9/spark-1.0.0-cdh5.1.0-bin-2.3.0-cdh5.0.3.tgz' into '/tmp/mesos/slaves/20140724-134606-16777343-5050-25095-0/frameworks/20140728-143300-24815808-5050-19059-0000/executors/20140724-134606-16777343-5050-25095-0/runs/c9c9eaa2-b722-4215-a35a-dc1c353963b9'
Spark assembly has been built with Hive, including Datanucleus jars on classpath
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
14/07/28 14:33:59 INFO SparkHadoopUtil: Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0728 14:33:59.896520 19785 exec.cpp:131] Version: 0.19.0
I0728 14:33:59.899474 19805 exec.cpp:205] Executor registered on slave 20140724-134606-16777343-5050-25095-0
14/07/28 14:33:59 INFO MesosExecutorBackend: Registered with Mesos as executor ID 20140724-134606-16777343-5050-25095-0
14/07/28 14:34:00 INFO SecurityManager: Changing view acls to: amilkowski
14/07/28 14:34:00 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(amilkowski)
14/07/28 14:34:00 INFO Slf4jLogger: Slf4jLogger started
14/07/28 14:34:00 INFO Remoting: Starting remoting
14/07/28 14:34:01 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://spark@localhost:40412]
14/07/28 14:34:01 INFO Remoting: Remoting now listens on addresses: [akka.tcp://spark@localhost:40412]
14/07/28 14:34:01 INFO SparkEnv: Connecting to MapOutputTracker: akka.tcp://spark@localhost:7077/user/MapOutputTracker
akka.actor.ActorNotFound: Actor not found for: ActorSelection[Actor[akka.tcp://spark@localhost:7077/]/user/MapOutputTracker]
    at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:66)
    at akka.actor.ActorSelection$$anonfun$resolveOne$1.apply(ActorSelection.scala:64)
    at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.processBatch$1(BatchingExecutor.scala:67)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply$mcV$sp(BatchingExecutor.scala:82)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at akka.dispatch.BatchingExecutor$Batch$$anonfun$run$1.apply(BatchingExecutor.scala:59)
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
    at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
    at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.unbatchedExecute(Future.scala:74)
    at akka.dispatch.BatchingExecutor$class.execute(BatchingExecutor.scala:110)
    at akka.dispatch.ExecutionContexts$sameThreadExecutionContext$.execute(Future.scala:73)
    at scala.concurrent.impl.CallbackRunnable.executeWithValue(Promise.scala:40)
    at scala.concurrent.impl.Promise$DefaultPromise.tryComplete(Promise.scala:248)
    at akka.pattern.PromiseActorRef.$bang(AskSupport.scala:269)
    at akka.actor.EmptyLocalActorRef.specialHandle(ActorRef.scala:512)
    at akka.actor.DeadLetterActorRef.specialHandle(ActorRef.scala:545)
    at akka.actor.DeadLetterActorRef.$bang(ActorRef.scala:535)
    at akka.remote.RemoteActorRefProvider$RemoteDeadLetterActorRef.$bang(RemoteActorRefProvider.scala:91)
    at akka.actor.ActorRef.tell(ActorRef.scala:125)
    at akka.dispatch.Mailboxes$$anon$1$$anon$2.enqueue(Mailboxes.scala:44)
    at akka.dispatch.QueueBasedMessageQueue$class.cleanUp(Mailbox.scala:438)
    at akka.dispatch.UnboundedDequeBasedMailbox$MessageQueue.cleanUp(Mailbox.scala:650)
    at akka.dispatch.Mailbox.cleanUp(Mailbox.scala:309)
    at akka.dispatch.MessageDispatcher.unregister(AbstractDispatcher.scala:204)
    at akka.dispatch.MessageDispatcher.detach(AbstractDispatcher.scala:140)
    at akka.actor.dungeon.FaultHandling$class.akka$actor$dungeon$FaultHandling$$finishTerminate(FaultHandling.scala:203)
    at akka.actor.dungeon.FaultHandling$class.terminate(FaultHandling.scala:163)
    at akka.actor.ActorCell.terminate(ActorCell.scala:338)
    at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:431)
    at akka.actor.ActorCell.systemInvoke(ActorCell.scala:447)
    at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:262)
    at akka.dispatch.Mailbox.run(Mailbox.scala:218)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Exception in thread "Thread-2" I0728 14:34:01.269739 19805 exec.cpp:412] Deactivating the executor libprocess

Re: akka.tcp://spark@localhost:7077/user/MapOutputTracker akka.actor.ActorNotFound

Posted by Andrew Milkowski <am...@gmail.com>.
Dear community, never mind! Although I was using Spark 1.0.0 everywhere on the
cluster, I had not updated my Spark client.

Changing the Spark versions in the client's pom from 0.9.0 to 1.0.0:

 <spark-core.version>1.0.0-cdh5.1.0</spark-core.version>
 <spark-streaming.version>1.0.0-cdh5.1.0</spark-streaming.version>

fixed the problem.
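
For anyone who hits the same mismatch: a quick sanity check from the driver side
would have caught this early. A minimal sketch, assuming the stock SparkContext API
(sc.version should be available on a 1.0.0 client; the app name is a placeholder):

import org.apache.spark.{SparkConf, SparkContext}

object VersionCheck {
  def main(args: Array[String]): Unit = {
    // A local context is enough here; we only want the version the client
    // (driver) classpath was built against.
    val conf = new SparkConf().setAppName("version-check").setMaster("local[1]")
    val sc = new SparkContext(conf)
    println("driver Spark version: " + sc.version)  // should match the 1.0.0 cluster
    sc.stop()
  }
}

If this prints 0.9.0 while the cluster and the SPARK_EXECUTOR_URI tarball are 1.0.0,
the client pom is the thing to fix, as above.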



