Posted to user@spark.apache.org by rapelly kartheek <ka...@gmail.com> on 2014/12/02 07:29:39 UTC

java.io.IOException: Filesystem closed

Hi,

I face the following exception when I submit a Spark application. The log
file shows:

14/12/02 11:52:58 ERROR LiveListenerBus: Listener EventLoggingListener threw an exception
java.io.IOException: Filesystem closed
    at org.apache.hadoop.hdfs.DFSClient.checkOpen(DFSClient.java:689)
    at org.apache.hadoop.hdfs.DFSOutputStream.flushOrSync(DFSOutputStream.java:1668)
    at org.apache.hadoop.hdfs.DFSOutputStream.hflush(DFSOutputStream.java:1629)
    at org.apache.hadoop.hdfs.DFSOutputStream.sync(DFSOutputStream.java:1614)
    at org.apache.hadoop.fs.FSDataOutputStream.sync(FSDataOutputStream.java:120)
    at org.apache.spark.util.FileLogger$$anonfun$flush$2.apply(FileLogger.scala:158)
    at org.apache.spark.util.FileLogger$$anonfun$flush$2.apply(FileLogger.scala:158)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.util.FileLogger.flush(FileLogger.scala:158)
    at org.apache.spark.scheduler.EventLoggingListener.logEvent(EventLoggingListener.scala:87)
    at org.apache.spark.scheduler.EventLoggingListener.onJobEnd(EventLoggingListener.scala:112)
    at org.apache.spark.scheduler.SparkListenerBus$$anonfun$postToAll$4.apply(SparkListenerBus.scala:52)
    at org.apache.spark.scheduler.SparkListenerBus$$anonfun$postToAll$4.apply(SparkListenerBus.scala:52)
    at org.apache.spark.scheduler.SparkListenerBus$$anonfun$foreachListener$1.apply(SparkListenerBus.scala:81)
    at org.apache.spark.scheduler.SparkListenerBus$$anonfun$foreachListener$1.apply(SparkListenerBus.scala:79)
    at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
    at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
    at org.apache.spark.scheduler.SparkListenerBus$class.foreachListener(SparkListenerBus.scala:79)
    at org.apache.spark.scheduler.SparkListenerBus$class.postToAll(SparkListenerBus.scala:52)
    at org.apache.spark.scheduler.LiveListenerBus.postToAll(LiveListenerBus.scala:32)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:56)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1$$anonfun$apply$mcV$sp$1.apply(LiveListenerBus.scala:56)
    at scala.Option.foreach(Option.scala:236)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply$mcV$sp(LiveListenerBus.scala:56)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1$$anonfun$run$1.apply(LiveListenerBus.scala:47)
    at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1160)
    at org.apache.spark.scheduler.LiveListenerBus$$anon$1.run(LiveListenerBus.scala:46)

Someone please help me resolve this!!

Thanks

Re: java.io.IOException: Filesystem closed

Posted by "Kartheek.R" <ka...@gmail.com>.
Are you replicating any RDDs?
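
By replicating I mean persisting an RDD with a replicated storage level.
Just to illustrate what I'm asking about (a rough sketch; the input path is
made up):

    import org.apache.spark.storage.StorageLevel

    // assuming sc is your SparkContext
    val data = sc.textFile("hdfs:///some/input")    // hypothetical path
    data.persist(StorageLevel.MEMORY_ONLY_2)        // "_2" = keep two replicas
    data.count()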



--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/java-io-IOException-Filesystem-closed-tp20150p21749.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org


Re: java.io.IOException: Filesystem closed

Posted by rapelly kartheek <ka...@gmail.com>.
Does the SparkContext shut itself down by default even if I don't call it
explicitly in my code? I ask because I ran the application without
sc.stop(), and I still get the "Filesystem closed" error along with the
correct output.

On Tue, Dec 2, 2014 at 2:20 PM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> It could be because those threads are finishing quickly.
>
> Thanks
> Best Regards
>
> On Tue, Dec 2, 2014 at 2:19 PM, rapelly kartheek <ka...@gmail.com>
> wrote:
>
>> But, somehow, if I run this application for the second time, I find that
>> the application gets executed and the results are out regardless of the
>> same errors in logs.
>>
>> On Tue, Dec 2, 2014 at 2:08 PM, Akhil Das <ak...@sigmoidanalytics.com>
>> wrote:
>>
>>> Your code seems to have a lot of threads, and I think you might be
>>> invoking sc.stop() before those threads have finished.
>>>
>>> Thanks
>>> Best Regards
>>>
>>> On Tue, Dec 2, 2014 at 12:04 PM, Akhil Das <ak...@sigmoidanalytics.com>
>>> wrote:
>>>
>>>> What is the application that you are submitting? Looks like you might
>>>> have invoked fs inside the app and then closed it within it.
>>>>
>>>> Thanks
>>>> Best Regards
>>>>
>>>> On Tue, Dec 2, 2014 at 11:59 AM, rapelly kartheek <
>>>> kartheek.mbms@gmail.com> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> I face the following exception when I submit a Spark application. The
>>>>> log file shows:
>>>>>
>>>>> 14/12/02 11:52:58 ERROR LiveListenerBus: Listener EventLoggingListener
>>>>> threw an exception
>>>>> java.io.IOException: Filesystem closed
>>>>>
>>>>> Someone please help me resolve this!!
>>>>>
>>>>> Thanks
>>>>>
>>>>>
>>>>
>>>
>>
>

Re: java.io.IOException: Filesystem closed

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
It could be because those threads are finishing quickly.

Thanks
Best Regards

On Tue, Dec 2, 2014 at 2:19 PM, rapelly kartheek <ka...@gmail.com>
wrote:

> But, somehow, if I run this application for the second time, I find that
> the application gets executed and the results are out regardless of the
> same errors in logs.
>
> On Tue, Dec 2, 2014 at 2:08 PM, Akhil Das <ak...@sigmoidanalytics.com>
> wrote:
>
>> Your code seems to have a lot of threads, and I think you might be
>> invoking sc.stop() before those threads have finished.
>>
>> Thanks
>> Best Regards
>>
>> On Tue, Dec 2, 2014 at 12:04 PM, Akhil Das <ak...@sigmoidanalytics.com>
>> wrote:
>>
>>> What is the application that you are submitting? Looks like you might
>>> have invoked fs inside the app and then closed it within it.
>>>
>>> Thanks
>>> Best Regards
>>>
>>> On Tue, Dec 2, 2014 at 11:59 AM, rapelly kartheek <
>>> kartheek.mbms@gmail.com> wrote:
>>>
>>>> Hi,
>>>>
>>>> I face the following exception when I submit a Spark application. The log
>>>> file shows:
>>>>
>>>> 14/12/02 11:52:58 ERROR LiveListenerBus: Listener EventLoggingListener
>>>> threw an exception
>>>> java.io.IOException: Filesystem closed
>>>>
>>>> Someone please help me resolve this!!
>>>>
>>>> Thanks
>>>>
>>>>
>>>
>>
>

Re: java.io.IOException: Filesystem closed

Posted by rapelly kartheek <ka...@gmail.com>.
But, somehow, if I run this application a second time, it gets executed
and the results come out, even though the same errors appear in the logs.

On Tue, Dec 2, 2014 at 2:08 PM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> Your code seems to have a lot of threads, and I think you might be invoking
> sc.stop() before those threads have finished.
>
> Thanks
> Best Regards
>
> On Tue, Dec 2, 2014 at 12:04 PM, Akhil Das <ak...@sigmoidanalytics.com>
> wrote:
>
>> What is the application that you are submitting? Looks like you might
>> have invoked fs inside the app and then closed it within it.
>>
>> Thanks
>> Best Regards
>>
>> On Tue, Dec 2, 2014 at 11:59 AM, rapelly kartheek <
>> kartheek.mbms@gmail.com> wrote:
>>
>>> Hi,
>>>
>>> I face the following exception when I submit a Spark application. The log
>>> file shows:
>>>
>>> 14/12/02 11:52:58 ERROR LiveListenerBus: Listener EventLoggingListener
>>> threw an exception
>>> java.io.IOException: Filesystem closed
>>>
>>> Someone please help me resolve this!!
>>>
>>> Thanks
>>>
>>>
>>
>

Re: java.io.IOException: Filesystem closed

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
Your code seems to have a lot of threads, and I think you might be invoking
sc.stop() before those threads have finished.
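
If that is the case, making the main thread wait for those threads before
stopping the context should avoid it. A rough sketch of what I mean (the
job bodies here are just placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    val sc = new SparkContext(new SparkConf().setAppName("example"))

    // launch the jobs from separate threads, as in your app
    val threads = (1 to 4).map { i =>
      new Thread(new Runnable {
        override def run(): Unit = {
          sc.parallelize(1 to 1000).map(_ * i).count()  // placeholder job
        }
      })
    }
    threads.foreach(_.start())

    // wait for every thread to finish ...
    threads.foreach(_.join())

    // ... and only then stop the context, so listeners such as
    // EventLoggingListener are not flushing to an already-closed stream
    sc.stop()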

Thanks
Best Regards

On Tue, Dec 2, 2014 at 12:04 PM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> What is the application that you are submitting? Looks like you might have
> invoked fs inside the app and then closed it within it.
>
> Thanks
> Best Regards
>
> On Tue, Dec 2, 2014 at 11:59 AM, rapelly kartheek <kartheek.mbms@gmail.com
> > wrote:
>
>> Hi,
>>
>> I face the following exception when I submit a Spark application. The log
>> file shows:
>>
>> 14/12/02 11:52:58 ERROR LiveListenerBus: Listener EventLoggingListener
>> threw an exception
>> java.io.IOException: Filesystem closed
>>
>> Someone please help me resolve this!!
>>
>> Thanks
>>
>>
>

Re: java.io.IOException: Filesystem closed

Posted by rapelly kartheek <ka...@gmail.com>.
Sorry for the delayed response. Please find my application attached.

On Tue, Dec 2, 2014 at 12:04 PM, Akhil Das <ak...@sigmoidanalytics.com>
wrote:

> What is the application that you are submitting? Looks like you might have
> invoked fs inside the app and then closed it within it.
>
> Thanks
> Best Regards
>
> On Tue, Dec 2, 2014 at 11:59 AM, rapelly kartheek <kartheek.mbms@gmail.com
> > wrote:
>
>> Hi,
>>
>> I face the following exception when I submit a Spark application. The log
>> file shows:
>>
>> 14/12/02 11:52:58 ERROR LiveListenerBus: Listener EventLoggingListener
>> threw an exception
>> java.io.IOException: Filesystem closed
>>
>> Someone please help me resolve this!!
>>
>> Thanks
>>
>>
>

Re: java.io.IOException: Filesystem closed

Posted by Akhil Das <ak...@sigmoidanalytics.com>.
What is the application that you are submitting? It looks like you might have
opened a FileSystem (fs) inside the app and then closed it there.
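
For example, a pattern like this in the driver would cause it (just a
sketch to show what I mean; the path is made up):

    import org.apache.hadoop.conf.Configuration
    import org.apache.hadoop.fs.{FileSystem, Path}

    // FileSystem.get() returns a JVM-wide cached instance. Closing it here
    // also closes the instance Spark's event logging writes through, which
    // then fails with "Filesystem closed".
    val fs = FileSystem.get(new Configuration())
    val out = fs.create(new Path("/tmp/example"))   // hypothetical path
    out.close()
    fs.close()                                      // <- the problematic call

    // Safer: take a private instance you own (or skip close() and let
    // Hadoop's shutdown hook clean up). Disabling the cache with
    // fs.hdfs.impl.disable.cache=true is another option.
    val myFs = FileSystem.newInstance(new Configuration())
    // ... use myFs ...
    myFs.close()                                    // affects only this instance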

Thanks
Best Regards

On Tue, Dec 2, 2014 at 11:59 AM, rapelly kartheek <ka...@gmail.com>
wrote:

> Hi,
>
> I face the following exception when I submit a Spark application. The log
> file shows:
>
> 14/12/02 11:52:58 ERROR LiveListenerBus: Listener EventLoggingListener
> threw an exception
> java.io.IOException: Filesystem closed
>
> Someone please help me resolve this!!
>
> Thanks
>
>