Posted to user@spark.apache.org by Alberto Garcia <ag...@inf.uc3m.es> on 2014/12/11 14:38:31 UTC

Standalone app: IOException due to broadcast.destroy()

Hello.

I'm pretty new to Spark.
I am developing a Spark application, testing it locally before deploying
it to a cluster. I have a problem with a broadcast variable. The
application raises

"Exception in thread "main" org.apache.spark.SparkException: Job aborted due
to stage failure: Task serialization failed: java.io.IOException: unexpected
exception type" 
java.io.ObjectStreamClass.throwMiscException(ObjectStreamClass.java:1538)
java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:994)
java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1496)
...
...
whenever I try to destroy that broadcast (I tried both the synchronous
and asynchronous modes).
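
By synchronous and asynchronous modes I mean roughly the following (a
minimal sketch, not my real code; whether the blocking overload of
destroy() is publicly accessible depends on the Spark version, so take
the commented line as an assumption):

import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(
  new SparkConf().setAppName("destroy-modes").setMaster("local"))
val bc = sc.broadcast(Seq(1, 2, 3))  // broadcast a small placeholder value
bc.destroy()                         // synchronous: blocks until removed
// bc.destroy(blocking = false)      // asynchronous, where this overload is exposed
sc.stop()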

The application follows an iterative pattern:

-Create accumulator

-for i=0....1000
       -foreach() --> add to the accumulator
       -broadcast(accumulator.value())
       -.... 
       -....use broadcasted value
       -....
       -broadcast.destroy()
       -accumulator.setValue(zeroValue)
endfor
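
In (simplified) Scala the pattern looks roughly like this (a minimal
sketch, not my real code; the input RDD, the Long accumulator, and the
map/count step are placeholders for what the application actually does):

import org.apache.spark.{SparkConf, SparkContext}

object BroadcastLoop {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("broadcast-loop").setMaster("local[*]"))
    val data = sc.parallelize(1 to 1000000)  // placeholder input
    val acc = sc.accumulator(0L)             // created once, before the loop

    for (i <- 0 until 1000) {
      data.foreach(x => acc += x.toLong)     // "the foreach line"
      val bc = sc.broadcast(acc.value)       // broadcast the accumulated value
      data.map(x => x + bc.value).count()    // ...use the broadcast value...
      bc.destroy()                           // "the destruction line"
      acc.setValue(0L)                       // reset to the zero value
    }
    sc.stop()
  }
}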

The exception is raised at i=1, on the foreach line. If I comment out the
destroy line, the application runs for more than 200 iterations before
failing (but that's another issue).

Where could my mistake be?

Spark 1.1.0
Hadoop 2.2.0
java version "1.8.0_25"
Java(TM) SE Runtime Environment (build 1.8.0_25-b17)
Java HotSpot(TM) 64-Bit Server VM (build 25.25-b02, mixed mode)

If you need more info, please ask me.

Thanks for the help.




--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Standalone-app-IOException-due-to-broadcast-destroy-tp20627.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscribe@spark.apache.org
For additional commands, e-mail: user-help@spark.apache.org