Posted to dev@spark.apache.org by Michael Allman <mi...@videoamp.com> on 2016/07/02 18:10:50 UTC

Can't build scala unidoc since Kafka 0.10 support was added

Hello,

I'm no longer able to run `sbt unidoc` successfully in branch-2.0, and the problem seems to stem from the addition of Kafka 0.10 support. If I remove either the Kafka 0.8 or the Kafka 0.10 project from the build, unidoc works. If I keep both in, I get two dozen otherwise inexplicable compilation errors during the unidoc task. Here are the first few (my working theory follows them):

[error] /Users/msa/workspace/spark-2.0/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/CachedKafkaConsumer.scala:50: value assign is not a member of org.apache.kafka.clients.consumer.KafkaConsumer[K,V]
[error]     c.assign(tps)
[error]       ^
[error] /Users/msa/workspace/spark-2.0/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/CachedKafkaConsumer.scala:95: too many arguments for method seek: (x$1: java.util.Map[org.apache.kafka.common.TopicPartition,Long])Unit
[error]     consumer.seek(topicPartition, offset)
[error]                  ^
[error] /Users/msa/workspace/spark-2.0/external/kafka-0-10/src/main/scala/org/apache/spark/streaming/kafka010/CachedKafkaConsumer.scala:100: value records is not a member of java.util.Map[String,org.apache.kafka.clients.consumer.ConsumerRecords[K,V]]
[error]     val r = p.records(topicPartition)
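
My working theory, such as it is: unidoc compiles the sources of every aggregated project against a single merged classpath, so only one kafka-clients version can be on it. The signatures in the errors (a seek that takes a Map, a poll result of type Map[String, ConsumerRecords]) look like the pre-0.9 consumer API, which would be consistent with the 0.8 client jar shadowing the 0.10 one during the unidoc run. A rough sketch of the kind of clash I mean, with project and version names assumed from the error paths rather than copied from the actual build:

// build.sbt sketch -- names and versions here are assumptions, not the real Spark build
lazy val streamingKafka08 = (project in file("external/kafka-0-8"))
  .settings(
    // 0.8-era artifact; its consumer API has no assign() and takes seek(Map[TopicPartition, Long])
    libraryDependencies += "org.apache.kafka" %% "kafka" % "0.8.2.1"
  )

lazy val streamingKafka010 = (project in file("external/kafka-0-10"))
  .settings(
    // new consumer API: assign(), seek(TopicPartition, offset), records(TopicPartition)
    libraryDependencies += "org.apache.kafka" % "kafka-clients" % "0.10.0.0"
  )

// Each project resolves its own classpath when compiled on its own, but a unidoc run
// that aggregates both can end up with a single kafka-clients version for all sources.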

Running `sbt compile` completes without error.

Has anyone else seen this behavior? Any ideas? It looks like a dependency management issue, but beyond the guess above I'm stumped.
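
For reference, one way to pull a project out of just the unidoc aggregation (rather than out of the whole build) would be something along these lines. This is only a sketch: unidocProjectFilter and the ScalaUnidoc config come from the sbt-unidoc plugin, inAnyProject/inProjects are sbt's scope filters, and streamingKafka010 is a placeholder for however the 0.10 project is actually referenced in SparkBuild:

// Sketch of a unidoc-only exclusion, not the actual Spark build definition
unidocProjectFilter in (ScalaUnidoc, unidoc) :=
  inAnyProject -- inProjects(streamingKafka010)

That would narrow the breakage to the doc build, but it obviously doesn't fix the underlying version conflict, so I'd still like to understand what the right fix is.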

Cheers,

Michael