Posted to user@spark.apache.org by gtanguy <g....@gmail.com> on 2015/10/08 10:54:01 UTC
Spark Ganglia ClassNotFoundException:
org.apache.spark.metrics.sink.GangliaSink
I built Spark with the Ganglia profile:
$SPARK_HOME/build/sbt -Pspark-ganglia-lgpl -Phadoop-1 -Phive
-Phive-thriftserver assembly
...
[info] Including from cache: metrics-ganglia-3.1.0.jar
...
In the master log:
ERROR actor.OneForOneStrategy: org.apache.spark.metrics.sink.GangliaSink
akka.actor.ActorInitializationException: exception during creation
    at akka.actor.ActorInitializationException$.apply(Actor.scala:164)
    at akka.actor.ActorCell.create(ActorCell.scala:596)
    at akka.actor.ActorCell.invokeAll$1(ActorCell.scala:456)
    at akka.actor.ActorCell.systemInvoke(ActorCell.scala:478)
    at akka.dispatch.Mailbox.processAllSystemMessages(Mailbox.scala:263)
    at akka.dispatch.Mailbox.run(Mailbox.scala:219)
    at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:393)
    at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.metrics.sink.GangliaSink
Did I forget something?
I am on Spark 1.3.1.
My metrics.properties:
*.sink.ganglia.class=org.apache.spark.metrics.sink.GangliaSink
*.sink.ganglia.host=ip-10-137-120-185.ec2.internal
*.sink.ganglia.port=5080
*.sink.ganglia.period=10
*.sink.ganglia.unit=seconds
*.sink.ganglia.ttl=1
*.sink.ganglia.mode=multicast
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Spark-ganglia-jClassNotFoundException-org-apache-spark-metrics-sink-GangliaSink-tp24977.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org