Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2019/10/08 05:43:15 UTC

[jira] [Resolved] (SPARK-24189) Spark Structured Streaming not working with the Kafka Transactions

     [ https://issues.apache.org/jira/browse/SPARK-24189?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-24189.
----------------------------------
    Resolution: Incomplete

> Spark Structured Streaming not working with the Kafka Transactions
> ------------------------------------------------------------------
>
>                 Key: SPARK-24189
>                 URL: https://issues.apache.org/jira/browse/SPARK-24189
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: bharath kumar avusherla
>            Priority: Major
>              Labels: bulk-closed
>
> I was trying to read a transactional Kafka topic with Spark Structured Streaming 2.3.0 and the Kafka option isolation-level = "read_committed", but Spark reads the data immediately instead of waiting for the data in the topic to be committed. The Spark documentation states that Structured Streaming supports Kafka version 0.10 or higher. I am using the command below to reproduce the scenario.
> val df = spark
>  .readStream
>  .format("kafka")
>  .option("kafka.bootstrap.servers", "localhost:9092")
>  .option("subscribe", "test-topic")
>  .option("isolation-level","read_committed")
>  .load()
> Can you please let me know whether transactional reads are supported in Spark 2.3.0 Structured Streaming, or am I missing something?
>  
> Thank you.
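
For context on the snippet above: the Kafka consumer configuration is spelled isolation.level, and Spark's Structured Streaming Kafka source only forwards consumer settings that carry the kafka. prefix, so the unprefixed isolation-level option as written never reaches the consumer. Newer Spark releases document passing it as kafka.isolation.level; whether the 2.3.0 source then handles the resulting offset gaps from transaction markers is the open question in this report. A minimal sketch of the prefixed form, assuming a local broker and a topic named test-topic:

val df = spark
  .readStream
  .format("kafka")
  .option("kafka.bootstrap.servers", "localhost:9092")
  .option("subscribe", "test-topic")
  // Forward the consumer's isolation.level via the "kafka." prefix so that
  // records from aborted or still-open transactions are not returned.
  .option("kafka.isolation.level", "read_committed")
  .load()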



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org