Posted to user@spark.apache.org by Mingyu Kim <mk...@palantir.com> on 2014/01/21 12:24:28 UTC

How to clean up jars on worker nodes

Hi all,

I'd like the jars added to worker nodes (i.e. via SparkContext.addJar()) to be
cleaned up on teardown. However, SparkContext.stop() doesn't seem to delete
them. What would be the best way to clear them? Or, is there an easy way to
add this functionality?
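
For context, here is roughly what I'm doing (app name and jar path are made up):

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf().setAppName("addjar-cleanup-example")
    val sc = new SparkContext(conf)

    // Ship an extra jar to the executors; it gets copied into each
    // worker's work directory for this application.
    sc.addJar("/path/to/extra-lib.jar")

    // ... run jobs that use classes from extra-lib.jar ...

    // Stopping the context shuts the executors down, but the copied
    // jar files appear to be left behind on the worker nodes.
    sc.stop()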

Mingyu