Posted to users@kafka.apache.org by 王天鹏 <wa...@meiqia.com> on 2016/07/15 07:46:56 UTC

ERROR Processor got uncaught exception. (kafka.network.Processor) with java.lang.ArrayIndexOutOfBoundsException

Hello, today my Kafka producer stopped working. Here is the story:
Spark: v1.6.2
Kafka broker: v0.9.0.1
Kafka client: tried 0.9.x and 0.10.x

In a Spark batch job, I created a Kafka producer and tried to send some
messages to the Kafka broker. It worked without any problem for more than
a month, until today :(

My Spark job just hangs, and after waiting some time, the broker logs the
following exception:

ERROR Processor got uncaught exception. (kafka.network.Processor)
java.lang.ArrayIndexOutOfBoundsException: 18
at org.apache.kafka.common.protocol.ApiKeys.forId(ApiKeys.java:68)
at
org.apache.kafka.common.requests.AbstractRequest.getRequest(AbstractRequest.java:39)
at kafka.network.RequestChannel$Request.<init>(RequestChannel.scala:79)
at kafka.network.Processor$$anonfun$run$11.apply(SocketServer.scala:426)
at kafka.network.Processor$$anonfun$run$11.apply(SocketServer.scala:421)
at scala.collection.Iterator$class.foreach(Iterator.scala:742)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1194)
at scala.collection.IterableLike$class.foreach(IterableLike.scala:72)
at scala.collection.AbstractIterable.foreach(Iterable.scala:54)
at kafka.network.Processor.run(SocketServer.scala:421)
at java.lang.Thread.run(Thread.java:745)
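For context on the stack trace: ApiKeys.forId on the broker looks a request id up in a fixed table of API keys known to that broker version, and id 18 is outside the range a 0.9 broker knows about. The following is a minimal sketch of that failure mode, not Kafka's actual code; the class name and the table contents are mine:

```java
public class ApiKeyLookupSketch {
    // Hypothetical table standing in for the API keys a 0.9-era broker
    // knows. The real 0.9 broker defines a small fixed range of ids, so a
    // newer request id such as 18 falls outside the array.
    private static final String[] KNOWN_APIS = {
        "Produce", "Fetch", "Offsets", "Metadata"
    };

    static String forId(int id) {
        // Mirrors the failure mode: indexing without a bounds check.
        return KNOWN_APIS[id];
    }

    public static void main(String[] args) {
        System.out.println(forId(1)); // prints "Fetch"
        try {
            forId(18); // an id this table does not know
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("surfaces on the broker as: " + e);
        }
    }
}
```

This is why the broker reports the raw ArrayIndexOutOfBoundsException instead of a cleaner "unsupported request" error.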

I tried switching clients, using another topic, and removing all consumer
groups (in ZooKeeper), with no luck.

Someone told me it may be related to this bug:
https://issues.apache.org/jira/browse/KAFKA-3547. Is that the case?

And if I upgrade my broker to v0.10.0.0, will it remain backward
compatible with my old v0.9.0.1 clients? What about Kafka Streams used
in Spark?

Thanks in advance!

Re: ERROR Processor got uncaught exception. (kafka.network.Processor) with java.lang.ArrayIndexOutOfBoundsException

Posted by 王天鹏 <wa...@meiqia.com>.
Sorry, please ignore my mistake. I found it was due to an empty topic
name; the topic name was reset to an empty string at some point in the
application code.
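For anyone hitting the same symptom: a defensive check before building each record would surface this in the application instead of as a broker-side exception. A minimal sketch; the helper name is mine, not a Kafka API:

```java
public class TopicGuard {
    // Hypothetical helper: fail fast in the application rather than
    // letting an empty topic name reach the broker.
    static String requireValidTopic(String topic) {
        if (topic == null || topic.trim().isEmpty()) {
            throw new IllegalArgumentException(
                "Topic name must be non-empty, got: '" + topic + "'");
        }
        return topic;
    }

    public static void main(String[] args) {
        // A valid name passes through unchanged.
        System.out.println(requireValidTopic("my-events")); // prints "my-events"
        try {
            requireValidTopic(""); // the bug from this thread
        } catch (IllegalArgumentException e) {
            System.out.println("caught in the app: " + e.getMessage());
        }
    }
}
```

Calling something like `requireValidTopic(topic)` at the point where the ProducerRecord is constructed would have turned the silent hang into an immediate, local error.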

