Posted to user@spark.apache.org by Jiří Syrový <sy...@gmail.com> on 2017/03/17 10:17:03 UTC
org.apache.spark.ui.jobs.UIData$TaskMetricsUIData
Hi,
is there a good way to get rid of UIData completely? I have switched off
the UI and decreased the retainedXXX settings to the minimum, but there still
seem to be a lot of instances of this class ($SUBJ) held in memory. Any ideas?
Thanks,
J. S.
spark {
  master = "local[2]"
  master = ${?SPARK_MASTER}
  info = ""
  info = ${?SPARK_INFO_URI}
  jobs = ""
  jobs = ${?SPARK_JOBS}
  jars.packages = "org.elasticsearch:elasticsearch-spark-20_2.11:5.0.1"
  submit.deployMode = "cluster"
  sql.crossJoin.enabled = true
  executor.memory = "4g"
  executor.memory = ${?SPARK_EXECUTOR_MEMORY}
  executor.cores = 2
  shuffle.service.enabled = true
  dynamicAllocation.enabled = true
  ui.enabled = false
  ui.retainedJobs = 100
  ui.retainedStages = 100
  ui.retainedTasks = 3000
  sql.retainedExecutions = 100
}
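(For readers unfamiliar with the `key = "default"` / `key = ${?ENV_VAR}` pairs above: that is HOCON's optional-override idiom, where the second assignment replaces the first only if the environment variable is actually set. A minimal plain-Scala sketch of the same semantics, with no Typesafe Config dependency:)

```scala
// Sketch of HOCON's optional-override semantics:
//   master = "local[2]"
//   master = ${?SPARK_MASTER}
// keeps "local[2]" unless SPARK_MASTER is set in the environment.
def resolve(default: String, envVar: String): String =
  sys.env.getOrElse(envVar, default)

val master = resolve("local[2]", "SPARK_MASTER")
println(s"resolved master: $master")
```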
 num     #instances         #bytes  class name
----------------------------------------------
   1:          4011     1563354608  [J
   2:        133214      185564000  [B
   3:        449373      140445216  [C
   4:       5481059      131545416  scala.Tuple2
   5:       5429700      130312800  java.lang.Long
   6:        238037       36071536  [Ljava.lang.Object;
   7:        148048       16581376  org.apache.spark.ui.jobs.UIData$TaskMetricsUIData
   8:        545859       13100616  scala.collection.immutable.$colon$colon
   9:        148048       11843840  org.apache.spark.ui.jobs.UIData$ShuffleReadMetricsUIData
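(A histogram like the one above is typically taken with `jmap -histo:live <pid>` against the driver JVM. For tracking whether the heap keeps growing between snapshots, overall usage can also be read in-process; a minimal sketch using only the JDK's standard management API:)

```scala
import java.lang.management.ManagementFactory

// Read current heap usage of this JVM, in bytes, via the MemoryMXBean.
val heap = ManagementFactory.getMemoryMXBean.getHeapMemoryUsage
println(s"heap used: ${heap.getUsed / (1024 * 1024)} MB, max: ${heap.getMax / (1024 * 1024)} MB")
```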