Posted to dev@spark.apache.org by Sean Owen <so...@cloudera.com> on 2014/05/28 00:00:42 UTC

Kafka + Spark Streaming and NoSuchMethodError, related to Manifest / reflection?

I'd like to resurrect this thread:
http://mail-archives.apache.org/mod_mbox/spark-user/201403.mbox/%3C6D657D19-1ECF-4E92-BF15-CC4762EF98BF@thekratos.com%3E

Basically when you call this particular Java-flavored overloading of
KafkaUtils.createStream:
https://github.com/apache/spark/blob/master/external/kafka/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala#L133

... you get

java.lang.NoSuchMethodException:
java.lang.Object.<init>(kafka.utils.VerifiableProperties)
        at java.lang.Class.getConstructor0(Class.java:2763)
        at java.lang.Class.getConstructor(Class.java:1693)
        at org.apache.spark.streaming.kafka.KafkaReceiver.onStart(KafkaInputDStream.scala:108)

This doesn't appear to be a version mismatch, and the error doesn't occur
when calling the other overloadings of this method; those work (well,
have other issues).

Something is making it try to instantiate java.lang.Object as if it's
a Decoder class.
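
To illustrate the exception shape, here's a standalone sketch (not Spark
code; plain JDK reflection): once the class being reflected on is
java.lang.Object, asking it for a one-argument constructor fails with
exactly this kind of message. The String parameter below just stands in
for kafka.utils.VerifiableProperties.

    object ObjectCtorLookup {
      def main(args: Array[String]): Unit = {
        // i.e. java.lang.Object -- what the receiver apparently ends up holding
        val decoderClass: Class[_] = classOf[AnyRef]
        try {
          // stand-in for getConstructor(classOf[VerifiableProperties])
          decoderClass.getConstructor(classOf[String])
        } catch {
          case e: NoSuchMethodException =>
            // prints: java.lang.Object.<init>(java.lang.String)
            println(e.getMessage)
        }
      }
    }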

I am wondering about this code at
https://github.com/apache/spark/blob/master/external/kafka/src/main/scala/org/apache/spark/streaming/kafka/KafkaUtils.scala#L148

    implicit val keyCmd: Manifest[U] =
      implicitly[Manifest[AnyRef]].asInstanceOf[Manifest[U]]
    implicit val valueCmd: Manifest[T] =
      implicitly[Manifest[AnyRef]].asInstanceOf[Manifest[T]]

... where U and T are the key/value Decoder types. I don't know enough
Scala to fully understand this, but is it possible this cast makes the
later reflective call lose the actual decoder type and try to instantiate
Object instead? The AnyRef made me wonder.
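
Here's what made me suspicious, as a standalone sketch (again not Spark
code; the names are made up). The cast compiles, but the manifest it
produces still only describes java.lang.Object, so anything that later
reflects on its runtime class sees Object instead of the decoder class:

    object ManifestCastDemo {
      trait Decoder[T]                        // stand-in for kafka.serializer.Decoder
      class StandInDecoder extends Decoder[String]

      // same trick as the KafkaUtils code quoted above
      def fakeManifest[U <: Decoder[_]]: Manifest[U] =
        implicitly[Manifest[AnyRef]].asInstanceOf[Manifest[U]]

      def main(args: Array[String]): Unit = {
        val m = fakeManifest[StandInDecoder]
        // prints "class java.lang.Object", not "class ...StandInDecoder", so a
        // later getConstructor(classOf[VerifiableProperties]) would be resolved
        // against Object and blow up as in the stack trace above
        println(m.runtimeClass)
      }
    }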

@tdas I'm hoping you might have some insight, since this code came in
with this commit in January:
https://github.com/apache/spark/commit/aa99f226a691ddcb4442d60f4cd4908f434cc4ce

I'll file a JIRA if it's legitimate; just asking first.