Posted to dev@spark.apache.org by atalay <at...@yahoo.com> on 2015/06/04 00:01:17 UTC

Cleaning up workers' directories automatically

Hi everyone,
every time new data arrives and updates run on our cluster, an unwanted
file is created in the workers' work directories. To clean these up
automatically, I used Cloudera Manager to set the "Spark (Standalone) Client
Advanced Configuration Snippet (Safety Valve) for spark-conf/spark-env.sh"
under Gateway Default Group -> Advanced Settings to:

export SPARK_WORKER_OPTS="-Dspark.worker.cleanup.enabled=true \
  -Dspark.worker.cleanup.interval=60 \
  -Dspark.worker.cleanup.appDataTtl=60"
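
As far as I understand from the standalone docs, both
spark.worker.cleanup.interval (default 1800) and
spark.worker.cleanup.appDataTtl (default 7 days) take values in seconds,
and the cleaner only removes the directories of applications that have
already stopped.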

After I restart the cluster, the change does show up in
spark/conf/spark-env.sh, but no cleanup happens. Does anyone know where the
mistake is, or is there another way to clean up automatically?
I am using CDH 4 with Spark 1.2.2 on the cluster.
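
For reference, this is how I would check whether the options actually
reached the worker daemon (a sketch, assuming the standalone worker runs as
a regular JVM process on each node):

    # The standalone worker daemon runs as org.apache.spark.deploy.worker.Worker.
    # If SPARK_WORKER_OPTS was picked up when the daemon started, the
    # -Dspark.worker.cleanup.* flags should appear on its command line.
    ps -ef | grep org.apache.spark.deploy.worker.Worker | grep spark.worker.cleanup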



