Posted to issues@spark.apache.org by "JinxinTang (Jira)" <ji...@apache.org> on 2020/08/01 02:24:00 UTC

[jira] [Commented] (SPARK-32500) Query and Batch Id not set for Structured Streaming Jobs in case of ForeachBatch in PySpark

    [ https://issues.apache.org/jira/browse/SPARK-32500?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17169219#comment-17169219 ] 

JinxinTang commented on SPARK-32500:
------------------------------------

Hi, [~abhishekd0907],

I have tested the code on master, branch-2.4.6, and branch-3.0.0, and foreachBatch seems to work fine in PySpark in all of them, as follows:

[See attached screenshot: image-2020-08-01-10-21-51-246.png]

> Query and Batch Id not set for Structured Streaming Jobs in case of ForeachBatch in PySpark
> -------------------------------------------------------------------------------------------
>
>                 Key: SPARK-32500
>                 URL: https://issues.apache.org/jira/browse/SPARK-32500
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark, Structured Streaming
>    Affects Versions: 2.4.6
>            Reporter: Abhishek Dixit
>            Priority: Major
>         Attachments: Screen Shot 2020-07-30 at 9.04.21 PM.png, image-2020-08-01-10-21-51-246.png
>
>
> Query Id and Batch Id information is not available for jobs started by a structured streaming query when the _foreachBatch_ API is used in PySpark.
> This happens only with foreachBatch in PySpark. ForeachBatch in Scala works fine, and other structured streaming sinks in PySpark also work fine. I am attaching a screenshot of the Jobs page.
> I think the job group is not set properly when _foreachBatch_ is used via PySpark. I have a framework that depends on the _queryId_ and _batchId_ information available in the job properties, so my framework does not work for the PySpark foreachBatch use case.
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org