Posted to user@spark.apache.org by soojin <xa...@yahoo.com> on 2014/02/04 18:22:09 UTC

Spark Streaming StreamingContext error

Hi,

I am getting the following error at runtime when building against spark-core 0.8.1
and spark-streaming 0.8.1.

libraryDependencies ++= Seq(
   "org.apache.spark" %% "spark-core" % "0.8.1-incubating",
   "org.apache.spark" %% "spark-streaming" % "0.8.1-incubating"
   )

I am using Scala 2.9.3.
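
For reference, the StreamingContext is created in StreamingTest.createContext;
stripped down to the relevant call it is essentially the following (simplified
sketch, the master URL, app name, and batch interval are just placeholders here):

package com.test.streaming

import org.apache.spark.streaming.{Seconds, StreamingContext}

object StreamingTest {
  def createContext(): StreamingContext = {
    // StreamingTest.scala:24 in the stack trace below: the error is thrown
    // from inside this constructor call
    new StreamingContext("local[2]", "StreamingTest", Seconds(1))
  }
}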


Please help. Thank you.

14/02/04 11:56:39 INFO server.Server: jetty-7.x.y-SNAPSHOT
14/02/04 11:56:39 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:59615
14/02/04 11:56:39 ERROR metrics.MetricsSystem: Sink class org.apache.spark.metrics.sink.MetricsServlet cannot be instantialized
java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
	at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:134)
	at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:129)
	at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:95)
	at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:95)
	at scala.collection.Iterator$class.foreach(Iterator.scala:772)
	at scala.collection.mutable.HashTable$$anon$1.foreach(HashTable.scala:157)
	at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:190)
	at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:45)
	at scala.collection.mutable.HashMap.foreach(HashMap.scala:95)
	at org.apache.spark.metrics.MetricsSystem.registerSinks(MetricsSystem.scala:129)
	at org.apache.spark.metrics.MetricsSystem.<init>(MetricsSystem.scala:83)
	at org.apache.spark.SparkEnv$.createFromSystemProperties(SparkEnv.scala:203)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:105)
	at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:548)
	at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:84)
	at com.test.streaming.StreamingTest.createContext(StreamingTest.scala:24)
Caused by: java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.module.SimpleSerializers.<init>(Ljava/util/List;)V
	at com.codahale.metrics.json.MetricsModule.setupModule(MetricsModule.java:213)
	at com.fasterxml.jackson.databind.ObjectMapper.registerModule(ObjectMapper.java:469)
	at org.apache.spark.metrics.sink.MetricsServlet.<init>(MetricsServlet.scala:44)
	... 22 more
Exception in thread "main" java.lang.ClassCastException: [Ljava.lang.Object; cannot be cast to [Lscala.Tuple2;
	at org.apache.spark.metrics.MetricsSystem.getServletHandlers(MetricsSystem.scala:79)
	at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:49)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:122)
	at org.apache.spark.streaming.StreamingContext$.createNewSparkContext(StreamingContext.scala:548)
	at org.apache.spark.streaming.StreamingContext.<init>(StreamingContext.scala:84)
	at com.test.streaming.StreamingTest.createContext(StreamingTest.scala:24)




Re: Spark Streaming StreamingContext error

Posted by Tathagata Das <ta...@gmail.com>.
It seems like it is not able to find a particular class,
org.apache.spark.metrics.sink.MetricsServlet.
How are you running your program? Is this an intermittent error? Does it go
away if you do a clean compilation of your project and run again?
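
For an sbt build, a fully clean rebuild and run would be something along these
lines (adjust accordingly if you are using a different build tool):

    sbt clean compile
    sbt run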

TD

