Posted to user@spark.apache.org by Koert Kuipers <ko...@tresata.com> on 2014/07/10 20:50:22 UTC

sparkStaging

In Spark 1.0.0, running in yarn-client mode, I am seeing that the .sparkStaging
directories do not get cleaned up after an application finishes.
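
I have not set spark.yarn.preserve.staging.files anywhere, and as far as I can
tell it defaults to false, so I would expect the staging directory to be removed
when the application ends. For reference, this is the kind of entry in
conf/spark-defaults.conf I mean (just a sketch of the setting, I have not
actually added this line):

# default behavior: do not keep the staged jars around after the job ends
spark.yarn.preserve.staging.files  false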

For example, I run:
$ spark-submit --class org.apache.spark.examples.SparkPi
spark-examples-1.0.0-hadoop2.3.0-cdh5.0.2.jar 10

After the application completes, this directory is left behind with one file in it:
$ hadoop fs -ls /user/koert/.sparkStaging/application_1404173427396_0107
Found 1 items
-rw-r--r--   3 koert koert  120100949 2014-07-10 14:47
/user/koert/.sparkStaging/application_1404173427396_0107/spark-assembly-1.0.0-hadoop2.3.0-cdh5.0.2.jar

Is this a known issue? There was some talk about this earlier, but I am not
sure whether it is supposed to be resolved by now.

Thanks! Koert