Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2015/05/16 13:15:59 UTC

[jira] [Resolved] (SPARK-7523) ERROR LiveListenerBus: Listener EventLoggingListener threw an exception

     [ https://issues.apache.org/jira/browse/SPARK-7523?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-7523.
------------------------------
    Resolution: Invalid

I think this should start as a discussion on the mailing list. It's not clear this is a Spark problem.

> ERROR LiveListenerBus: Listener EventLoggingListener threw an exception
> -----------------------------------------------------------------------
>
>                 Key: SPARK-7523
>                 URL: https://issues.apache.org/jira/browse/SPARK-7523
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 1.3.0
>         Environment: Prod
>            Reporter: sagar
>            Priority: Blocker
>         Attachments: schema.txt, spark-0.0.1-SNAPSHOT.jar
>
>
> Hi Team,
> I am using CDH 5.4 with Spark 1.3.0.
> I am getting the error below while executing the following command.
> I see JIRAs (SPARK-2906/SPARK-1407) indicating this issue is resolved, but I could not find what the fix was. Can you please guide or suggest a solution, as this is a production issue?
> $ spark-submit   --master local[4]   --class org.sample.spark.SparkFilter   --name "Spark Sample Program"   spark-0.0.1-SNAPSHOT.jar  /user/user1/schema.txt
> ==================================
> 15/05/11 06:28:36 ERROR LiveListenerBus: Listener EventLoggingListener threw an exception
> java.lang.reflect.InvocationTargetException
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:606)
> 	at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$3.apply(EventLoggingListener.scala:144)
> 	at org.apache.spark.scheduler.EventLoggingListener$$anonfun$logEvent$3.apply(EventLoggingListener.scala:144)
> 	at scala.Option.foreach(Option.scala:236)
> 	at org.apache.spark.scheduler.EventLoggingListener.logEvent(EventLoggingListener.scala:144)
> 	at org.apache.spark.scheduler.EventLoggingListener.onJobEnd(EventLoggingListener.scala:169)
> 	at org.apache.spark.scheduler.SparkListenerBus$class.onPostEvent(SparkListenerBus.scala:36)
> 	at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
> 	at org.apache.spark.scheduler.LiveListenerBus.onPostEvent(LiveListenerBus.scala:31)
> 	at org.apache.spark.util.ListenerBus$class.postToAll(ListenerBus.scala:53)
> 	at org.apache.spark.util.AsynchronousListenerBus.postToAll(AsynchronousListenerBus.scala:36)
> 	at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(AsynchronousListenerBus.scala:76)
> 	at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1.apply(AsynchronousListenerBus.scala:61)
> 	at org.apache.spark.util.AsynchronousListenerBus$$anon$1$$anonfun$run$1.apply(AsynchronousListenerBus.scala:61)
> 	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1617)
> 	at org.apache.spark.util.AsynchronousListenerBus$$anon$1.run(AsynchronousListenerBus.scala:60)
> Caused by: java.io.IOException: Filesystem closed
> 	at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:792)
> 	at org.apache.hadoop.hdfs.DFSOutputStream.flushOrSync(DFSOutputStream.java:1998)
> 	at org.apache.hadoop.hdfs.DFSOutputStream.hflush(DFSOutputStream.java:1959)
> 	at org.apache.hadoop.fs.FSDataOutputStream.hflush(FSDataOutputStream.java:130)
> 	... 19 more
> ==================================
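The root cause in the trace, `java.io.IOException: Filesystem closed`, typically means the shared, JVM-cached Hadoop FileSystem instance was closed (commonly by application code calling `FileSystem.get(conf).close()`) before Spark's EventLoggingListener finished flushing the event log to HDFS. As a sketch of a workaround, not a fix confirmed for this ticket, event logging can be disabled for the submission so the listener never writes to HDFS:

```shell
# Workaround sketch (assumption: the failure is triggered by event logging).
# Disabling the event log means EventLoggingListener is never registered,
# so nothing is flushed to HDFS at job end.
spark-submit \
  --master local[4] \
  --class org.sample.spark.SparkFilter \
  --name "Spark Sample Program" \
  --conf spark.eventLog.enabled=false \
  spark-0.0.1-SNAPSHOT.jar /user/user1/schema.txt
```

If event logging is needed (for example, for the history server), the more durable change is to stop closing the cached FileSystem in application code, or to obtain a private instance via `FileSystem.newInstance(conf)` and close that instead.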



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
