Posted to issues@spark.apache.org by "Abhijeet Prasad (Jira)" <ji...@apache.org> on 2019/12/03 01:14:00 UTC

[jira] [Created] (SPARK-30105) Add StreamingQueryListener support to PySpark

Abhijeet Prasad created SPARK-30105:
---------------------------------------

             Summary: Add StreamingQueryListener support to PySpark
                 Key: SPARK-30105
                 URL: https://issues.apache.org/jira/browse/SPARK-30105
             Project: Spark
          Issue Type: New Feature
          Components: PySpark, Structured Streaming
    Affects Versions: 3.0.0
            Reporter: Abhijeet Prasad


Add support for StreamingQueryListener to PySpark.

Currently, `StreamingQueryListener` in Scala is implemented as an abstract class, so a Python object cannot implement it through a Py4j proxy (Py4j proxies can only implement Java interfaces) unless we add a custom Scala/Java wrapper.

[https://github.com/apache/spark/blob/master/sql/core/src/main/scala/org/apache/spark/sql/streaming/StreamingQueryListener.scala]
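For reference, Py4j's callback mechanism lets a Python object stand in for a JVM object only by declaring the Java *interfaces* it implements via a nested `Java` class — which is exactly why the abstract class above is a blocker. A rough sketch of what the Python side could look like, assuming a hypothetical `PythonStreamingQueryListener` Java interface wrapper were added to Spark (no such interface exists today):

```python
class SentryQueryListener:
    """Python callback object that Py4j could expose to the JVM.

    Method names mirror the Scala StreamingQueryListener callbacks.
    """

    def __init__(self):
        self.errors = []

    def onQueryStarted(self, event):
        pass

    def onQueryProgress(self, event):
        pass

    def onQueryTerminated(self, event):
        # This is where an error-reporting SDK could forward the
        # failure; here we just record it.
        if event.exception() is not None:
            self.errors.append(event.exception())

    class Java:
        # Py4j reads this nested class to learn which Java interfaces
        # the Python object implements. The interface name below is a
        # hypothetical wrapper that would have to be added to Spark.
        implements = ["org.apache.spark.sql.streaming.PythonStreamingQueryListener"]
```

An instance of this class would then be registered through the Py4j callback server; the JVM-side wrapper would delegate each real `StreamingQueryListener` event to it.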

This would be very useful in my own case: I am building a library that sends Python errors to Sentry.io ([https://docs.sentry.io/platforms/python/pyspark/]) and would like to hook into onQueryTerminated to report errors.

I can take this on if you point me in the right direction; I am new to the codebase, so I am not quite sure what the usual process looks like for porting a Scala API to the PySpark API.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org