Posted to user@spark.apache.org by "nimmi.cv" <ni...@gmail.com> on 2020/05/21 21:15:20 UTC

Spark Kafka Streaming With Transactional Messages

I am using Spark 2.4 with createDirectStream to read from a Kafka topic. The
topic has messages written by a transactional producer.

I am getting the following error:
"requirement failed: Got wrong record for
spark-executor-FtsTopicConsumerGrp7 test11-1 even after seeking to offset 85
got offset 86 instead. If this is a compacted topic, consider enabling
spark.streaming.kafka.allowNonConsecutiveOffsets"


When I enable spark.streaming.kafka.allowNonConsecutiveOffsets, I get the
following error:
java.lang.IllegalArgumentException: requirement failed: Failed to get
records for compacted spark-executor-FtsTopicConsumerGrpTESTING_5
fts.analytics-0 after polling for 10000
            at scala.Predef$.require(Predef.scala:224)

I have also set kafka.isolation.level="read_committed".
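For context, my setup looks roughly like the sketch below (the broker address is a placeholder, and the topic/group names are simplified from my actual job):

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe

// Placeholder names -- substitute your own.
val topics = Array("test11")

val kafkaParams = Map[String, Object](
  "bootstrap.servers" -> "localhost:9092",
  "key.deserializer"   -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id"           -> "FtsTopicConsumerGrp7",
  // Only read messages from committed transactions.
  "isolation.level"    -> "read_committed"
)

// With read_committed, transaction markers leave gaps in the offsets,
// so non-consecutive offsets have to be allowed.
val conf = new SparkConf()
  .setAppName("transactional-read")
  .set("spark.streaming.kafka.allowNonConsecutiveOffsets", "true")

val ssc = new StreamingContext(conf, Seconds(10))

val stream = KafkaUtils.createDirectStream[String, String](
  ssc,
  PreferConsistent,
  Subscribe[String, String](topics, kafkaParams)
)
```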

Any help on this will be appreciated.

--
Sent from: http://apache-spark-user-list.1001560.n3.nabble.com/

---------------------------------------------------------------------
To unsubscribe e-mail: user-unsubscribe@spark.apache.org


Re: Spark Kafka Streaming With Transactional Messages

Posted by jianyangusa <ji...@gmail.com>.
I have the same issue. Do you have a solution? Maybe Spark Streaming does not
support transactional messages. I use a Kafka client directly to retrieve the
transactional messages. Maybe we can ask Spark to support this feature.
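For what it's worth, the plain Kafka consumer handles the offset gaps left by transaction markers transparently when isolation.level is read_committed. A minimal sketch of what I mean (broker, topic, and group names below are placeholders):

```scala
import java.time.Duration
import java.util.{Collections, Properties}
import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.common.serialization.StringDeserializer

val props = new Properties()
props.put("bootstrap.servers", "localhost:9092")  // placeholder
props.put("group.id", "txn-reader")               // placeholder
props.put("key.deserializer", classOf[StringDeserializer].getName)
props.put("value.deserializer", classOf[StringDeserializer].getName)
// Only deliver records from committed transactions; the consumer
// skips transaction markers and aborted records on its own.
props.put("isolation.level", "read_committed")

val consumer = new KafkaConsumer[String, String](props)
consumer.subscribe(Collections.singletonList("fts.analytics"))  // placeholder topic

while (true) {
  val records = consumer.poll(Duration.ofMillis(1000))
  records.forEach(r => println(s"${r.offset}: ${r.value}"))
}
```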


