Posted to user@spark.apache.org by jfowkes <ma...@gmail.com> on 2014/07/15 11:24:59 UTC
Kryo NoSuchMethodError on Spark 1.0.0 standalone
Hi there,
I've been successfully using the precompiled Spark 1.0.0 Java API on a small
cluster in standalone mode. However, when I try to use the Kryo serializer by
adding
conf.set("spark.serializer","org.apache.spark.serializer.KryoSerializer");
as suggested, Spark crashes out with the following error:
Exception in thread "main" java.lang.NoSuchMethodError: com.esotericsoftware.kryo.Kryo.setInstantiatorStrategy(Lorg/objenesis/strategy/InstantiatorStrategy;)V
    at com.twitter.chill.KryoBase.setInstantiatorStrategy(KryoBase.scala:85)
    at com.twitter.chill.EmptyScalaKryoInstantiator.newKryo(ScalaKryoInstantiator.scala:57)
    at org.apache.spark.serializer.KryoSerializer.newKryo(KryoSerializer.scala:56)
    at org.apache.spark.serializer.KryoSerializerInstance.<init>(KryoSerializer.scala:130)
    at org.apache.spark.serializer.KryoSerializer.newInstance(KryoSerializer.scala:92)
    at org.apache.spark.broadcast.HttpBroadcast$.write(HttpBroadcast.scala:172)
    at org.apache.spark.broadcast.HttpBroadcast.<init>(HttpBroadcast.scala:57)
    at org.apache.spark.broadcast.HttpBroadcastFactory.newBroadcast(HttpBroadcastFactory.scala:35)
    at org.apache.spark.broadcast.HttpBroadcastFactory.newBroadcast(HttpBroadcastFactory.scala:29)
    at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
    at org.apache.spark.SparkContext.broadcast(SparkContext.scala:776)
    at org.apache.spark.SparkContext.hadoopFile(SparkContext.scala:545)
    at org.apache.spark.SparkContext.textFile(SparkContext.scala:457)
    at org.apache.spark.api.java.JavaSparkContext.textFile(JavaSparkContext.scala:171)
Yet Kryo is very much present in the spark-assembly jar. I'm very confused
by this...
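A NoSuchMethodError like this usually means the Kryo class that gets loaded at runtime is an older copy than the one in spark-assembly (an old Kryo on the classpath that lacks setInstantiatorStrategy). A stdlib-only sketch for checking which jar a class is actually loaded from, shown here on its own class for illustration; on the driver you would pass Class.forName("com.esotericsoftware.kryo.Kryo") instead:

```java
// Sketch: locate which jar/directory a class is loaded from, to spot a
// classpath conflict (e.g. an older Kryo shadowing the one bundled in
// spark-assembly). Demonstrated with this class itself; substitute
// Class.forName("com.esotericsoftware.kryo.Kryo") on the cluster.
public class WhichJar {
    static String locate(Class<?> cls) {
        // Resolve the .class file as a resource; the URL reveals the
        // jar or directory the class loader actually picked it from.
        java.net.URL url = cls.getResource(
                "/" + cls.getName().replace('.', '/') + ".class");
        return url == null ? "not found" : url.toString();
    }

    public static void main(String[] args) {
        System.out.println(locate(WhichJar.class));
    }
}
```

If the URL points at a jar other than the Spark assembly, that jar is the likely culprit.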
Regards,
Jari
--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Kryo-NoSuchMethodError-on-Spark-1-0-0-standalone-tp9746.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.