Posted to user@spark.apache.org by Grega Kešpret <gr...@celtra.com> on 2014/02/24 21:06:46 UTC
metrics.MetricsSystem: Sink class org.apache.spark.metrics.sink.MetricsServlet cannot be instantialized
Hi,
I'm seeing the output below in the logs when I start a job on the driver (with
master = local) after packaging the driver sources together with the pre-built
spark-core-assembly-v0.8.1-incubating.jar into a fat jar with sbt/sbt
assembly. However, when I start it with sbt/sbt run, it works fine.
I've already tried rm -rf target && rm -rf ~/.sbt && rm -rf ~/.ivy2.
Any ideas?
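For context, the build that produces the fat jar looks roughly like this (a
sketch only; the project name, Scala version, and merge strategy below are
illustrative, not my exact build — the pre-built Spark assembly jar sits in
lib/ as an unmanaged dependency):

```scala
// build.sbt (sketch) -- driver sources plus the pre-built Spark assembly
// jar from lib/ are bundled into one fat jar by the sbt-assembly plugin.
// All names and versions here are illustrative.
name := "counting-interactions"

scalaVersion := "2.9.3"

// Jars placed in lib/ are picked up automatically by sbt as unmanaged
// dependencies, so spark-core-assembly-v0.8.1-incubating.jar needs no
// libraryDependencies entry.

// When several jars ship the same file path, keep one copy instead of
// failing the assembly with a deduplication error.
assemblyMergeStrategy in assembly := {
  case PathList("META-INF", xs @ _*) => MergeStrategy.discard
  case _                             => MergeStrategy.first
}
```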
14/02/24 16:24:48,223 ERROR metrics.MetricsSystem: Sink class org.apache.spark.metrics.sink.MetricsServlet cannot be instantialized
java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:134)
    at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:129)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:95)
    at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:95)
    at scala.collection.Iterator$class.foreach(Iterator.scala:772)
    at scala.collection.mutable.HashTable$$anon$1.foreach(HashTable.scala:157)
    at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:190)
    at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:45)
    at scala.collection.mutable.HashMap.foreach(HashMap.scala:95)
    at org.apache.spark.metrics.MetricsSystem.registerSinks(MetricsSystem.scala:129)
    at org.apache.spark.metrics.MetricsSystem.<init>(MetricsSystem.scala:83)
    at org.apache.spark.metrics.MetricsSystem$.createMetricsSystem(MetricsSystem.scala:162)
    at org.apache.spark.SparkEnv$.createFromSystemProperties(SparkEnv.scala:194)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:105)
    at com.celtra.CountingInteractions$.main(CountingInteractions.scala:18)
    at com.celtra.CountingInteractions.main(CountingInteractions.scala)
Caused by: java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.module.SimpleSerializers.<init>(Ljava/util/List;)V
    at com.codahale.metrics.json.MetricsModule.setupModule(MetricsModule.java:213)
    at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:469)
    at org.apache.spark.metrics.sink.MetricsServlet.<init>(MetricsServlet.scala:44)
    ... 20 more
Exception in thread "main" java.lang.ClassCastException: [Ljava.lang.Object; cannot be cast to [Lscala.Tuple2;
    at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:79)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:49)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:122)
    at com.celtra.CountingInteractions$.main(CountingInteractions.scala:18)
    at com.celtra.CountingInteractions.main(CountingInteractions.scala)
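If it helps with diagnosis: the NoSuchMethodError on SimpleSerializers.<init>(java.util.List) looks like two incompatible versions of jackson-databind ending up on the fat-jar classpath. One thing I could try (a sketch, assuming managed sbt dependencies — the version number here is illustrative, not one I've confirmed) is forcing a single jackson-databind version:

```scala
// build.sbt fragment (sketch) -- pin one jackson-databind across the
// whole dependency graph so the List-based SimpleSerializers constructor
// that metrics-json expects actually exists at runtime. The version is
// illustrative; it would need to match what Spark 0.8.1 was built against.
dependencyOverrides += "com.fasterxml.jackson.core" % "jackson-databind" % "2.2.3"
```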
Thanks!
Grega