Posted to issues@spark.apache.org by "Parhy (JIRA)" <ji...@apache.org> on 2018/10/17 15:47:00 UTC

[jira] [Updated] (SPARK-25759) StreamingListenerBus: Listener JavaStreamingListenerWrapper threw an exception

     [ https://issues.apache.org/jira/browse/SPARK-25759?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Parhy updated SPARK-25759:
--------------------------
    Affects Version/s:     (was: 2.2.0)
                       2.2.1

> StreamingListenerBus: Listener JavaStreamingListenerWrapper threw an exception
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-25759
>                 URL: https://issues.apache.org/jira/browse/SPARK-25759
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.2.1
>         Environment: Any
>            Reporter: Parhy
>            Priority: Blocker
>
> I am using PySpark with Spark Streaming version 2.2.1, and AWS S3 is my source.
> Once a batch completes, I would like to remove the processed files from S3, so I have extended the StreamingListener class. I can see the listener being called when the batch completes, but I also get the exception below. A Stack Overflow question (linked at the bottom) describes the exact same problem, with no solution.
> Kindly help here.
>  
> ERROR StreamingListenerBus: Listener JavaStreamingListenerWrapper threw an exception
> py4j.Py4JException: An exception was raised by the Python Proxy. Return Message: x
>  at py4j.Protocol.getReturnValue(Protocol.java:438)
>  at py4j.reflection.PythonProxyHandler.invoke(PythonProxyHandler.java:105)
>  at com.sun.proxy.$Proxy10.onBatchCompleted(Unknown Source)
>  at org.apache.spark.streaming.api.java.PythonStreamingListenerWrapper.onBatchCompleted(JavaStreamingListener.scala:89)
>  at org.apache.spark.streaming.api.java.JavaStreamingListenerWrapper.onBatchCompleted(JavaStreamingListenerWrapper.scala:111)
>  at org.apache.spark.streaming.scheduler.StreamingListenerBus.doPostEvent(StreamingListenerBus.scala:63)
>  at org.apache.spark.streaming.scheduler.StreamingListenerBus.doPostEvent(StreamingListenerBus.scala:29)
>  at org.apache.spark.util.ListenerBus$class.postToAll(ListenerBus.scala:63)
>  at org.apache.spark.streaming.scheduler.StreamingListenerBus.postToAll(StreamingListenerBus.scala:29)
>  at org.apache.spark.streaming.scheduler.StreamingListenerBus.onOtherEvent(StreamingListenerBus.scala:43)
>  at org.apache.spark.scheduler.SparkListenerBus$class.doPostEvent(SparkListenerBus.scala:75)
>  at org.apache.spark.scheduler.LiveListenerBus.doPostEvent(LiveListenerBus.scala:36)
>  at org.apache.spark.scheduler.LiveListenerBus.doPostEvent(LiveListenerBus.scala:36)
>  at org.apache.spark.util.ListenerBus$class.postToAll(ListenerBus.scala:63)
>  at org.apache.spark.scheduler.LiveListenerBus.postToAll(LiveListenerBus.scala:36)
>  at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply$mcV$sp(LiveListenerBus.scala:94)
>  at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:79)
>  at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:79)
>  at scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)
>  at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:78)
>  at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1279)
>  at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:77)
>  
> Link to the Stack Overflow question:
> https://stackoverflow.com/questions/47780794/py4jexception-using-pyspark-streaminglistener/52858375#52858375
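The failure mode in the report can be illustrated without a cluster. In the trace, the Java-side JavaStreamingListenerWrapper invokes the Python listener through a py4j proxy; any uncaught exception inside the Python callback crosses that boundary and is logged by StreamingListenerBus as "An exception was raised by the Python Proxy". Below is a minimal pure-Python sketch of that pattern (not the reporter's actual code; the class and function names are illustrative stand-ins, not pyspark or py4j APIs) showing how guarding the callback body keeps the failure contained:

```python
class BatchCompletedListener:
    """Stand-in for a pyspark.streaming.listener.StreamingListener subclass."""

    def __init__(self, cleanup):
        # cleanup: e.g. a function that deletes processed S3 objects
        self.cleanup = cleanup
        self.errors = []

    def onBatchCompleted(self, batchCompleted):
        # Catch everything here: an exception escaping this method would
        # cross the py4j boundary and surface as the Py4JException seen
        # in the report's stack trace.
        try:
            self.cleanup(batchCompleted)
        except Exception as exc:
            self.errors.append(exc)


def post_batch_completed(listener, event):
    """Stand-in for the Java-side listener-bus call into the Python proxy."""
    listener.onBatchCompleted(event)


if __name__ == "__main__":
    def flaky_cleanup(event):
        raise RuntimeError("s3 delete failed")

    listener = BatchCompletedListener(flaky_cleanup)
    post_batch_completed(listener, event={"batchTime": 1})
    # The failure is recorded on the listener instead of propagating
    # back through the proxy.
    print(len(listener.errors))
```

This only suppresses the symptom on the Python side; if the callback body itself is sound and the exception still appears, the cause is likely in the py4j callback channel rather than the listener logic.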



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
