Posted to user@spark.apache.org by Jean-Francois Gosselin <jf...@gmail.com> on 2017/07/17 21:05:42 UTC

Spark Streaming handling Kafka exceptions

How can I handle Kafka errors with my DirectStream (network issue,
ZooKeeper or a broker going down)? For example, when the consumer fails to
connect to Kafka (at startup) I only get a DEBUG log (not even an ERROR)
and no exception is thrown ...

I'm using Spark 2.1.1 and spark-streaming-kafka-0-10.
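
For reference, here is roughly how the stream is set up (a minimal
sketch, not my exact code; the bootstrap servers, topic name, and group
id below are placeholders):

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

// Standard consumer config; "localhost:9092" matches the failing
// connection in the log below.
val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "localhost:9092",
  "key.deserializer" -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id" -> "my-group",
  "auto.offset.reset" -> "latest",
  "enable.auto.commit" -> (false: java.lang.Boolean)
)

val conf = new SparkConf().setAppName("KafkaDirectStream").setMaster("local[2]")
val ssc = new StreamingContext(conf, Seconds(5))

// Direct stream from spark-streaming-kafka-0-10
val stream = KafkaUtils.createDirectStream[String, String](
  ssc,
  PreferConsistent,
  Subscribe[String, String](Array("my-topic"), kafkaParams)
)

stream.foreachRDD { rdd =>
  println(s"Batch size: ${rdd.count()}")
}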

16:50:23.149 [ForkJoinPool-1-worker-5] DEBUG o.a.kafka.common.network.Selector - Connection with localhost/127.0.0.1 disconnected
java.net.ConnectException: Connection refused: no further information
    at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
    at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:717)
    at org.apache.kafka.common.network.PlaintextTransportLayer.finishConnect(PlaintextTransportLayer.java:51)
    at org.apache.kafka.common.network.KafkaChannel.finishConnect(KafkaChannel.java:81)
    at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:335)
    at org.apache.kafka.common.network.Selector.poll(Selector.java:303)
    at org.apache.kafka.clients.NetworkClient.poll(NetworkClient.java:349)
    at org.apache.kafka.clients.consumer.internals.ConsumerNetworkClient.poll(ConsumerNetworkClient.java:226)
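
Wrapping start()/awaitTermination() in a try/catch doesn't surface it
either. My assumption is that the Kafka consumer retries the connection
internally and only logs at DEBUG, so nothing propagates to the driver
(sketch, continuing from the setup above):

// Continuing from the ssc above. Nothing lands in the catch block
// for the broker-down case: the ConnectException shown in the log
// is handled inside the Kafka client's poll loop.
try {
  ssc.start()
  ssc.awaitTermination()
} catch {
  case e: Throwable =>
    // Never reached when the broker is simply unreachable
    println(s"Streaming failed: $e")
    ssc.stop(stopSparkContext = true, stopGracefully = false)
}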


Thanks