Posted to dev@spark.apache.org by Praups Kumar <pr...@gmail.com> on 2019/07/24 11:18:04 UTC

How to get Peak CPU Utilization Rate in Spark

Hi Spark dev

MapReduce uses ResourceCalculatorProcessTree in Task.java to
get the peak CPU utilization of a task.


The same is done at the YARN NodeManager level in the class
ContainersMonitorImpl.

However, I am not able to find any way to get the peak CPU utilization
of an executor in Spark.
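For context, on Linux the ResourceCalculatorProcessTree implementation derives CPU time from /proc/<pid>/stat. A minimal sketch of that idea, with peak tracking across samples, might look like the following; this is an illustration only, not Spark or Hadoop API, and the function names and the CLK_TCK value are assumptions:

```python
import time

# Assumption: USER_HZ is 100 on typical Linux; the portable way is
# os.sysconf("SC_CLK_TCK").
CLK_TCK = 100

def cpu_time_seconds(pid):
    """Read cumulative user + system CPU time from /proc/<pid>/stat."""
    with open(f"/proc/{pid}/stat") as f:
        stat = f.read()
    # Split after the parenthesized command name; utime and stime are
    # fields 14 and 15 of the stat line (1-indexed), so indices 11 and 12
    # of the remainder, which starts at field 3.
    fields = stat.rsplit(")", 1)[1].split()
    utime, stime = int(fields[11]), int(fields[12])
    return (utime + stime) / CLK_TCK

def sample_peak_cpu_percent(pid, samples=5, interval=0.1):
    """Poll CPU time and keep the peak utilization rate between samples."""
    peak = 0.0
    prev = cpu_time_seconds(pid)
    for _ in range(samples):
        time.sleep(interval)
        cur = cpu_time_seconds(pid)
        peak = max(peak, (cur - prev) / interval * 100.0)
        prev = cur
    return peak
```

A per-executor metric would need the same kind of polling to run inside each executor process (or its container monitor), since only a peak observed over the executor's lifetime, not a single snapshot, answers this question.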

Please help me with this.