Posted to user@spark.apache.org by Wang Jiaye <ru...@gmail.com> on 2016/05/20 03:35:47 UTC

Query about how to estimate cpu usage for spark

For MR jobs, there are job counters that report CPU time in milliseconds, but I
cannot find similar metrics in Spark, which would be quite useful. Does anyone
know about this?

Re: Query about how to estimate cpu usage for spark

Posted by Mich Talebzadeh <mi...@gmail.com>.
Please note that CPU usage varies with time; it is not a fixed value.

First, have a look at the Spark UI, which runs on port 4040, under the Jobs tab.
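
If you prefer to pull numbers programmatically rather than read them off the UI, the same information is also served as JSON by the monitoring REST API on the driver, on the same port as the UI. A minimal Scala sketch, assuming the driver runs locally on the default port 4040 (hostname and port are assumptions, adjust for your setup):

    import scala.io.Source

    // Base URL of the Spark monitoring REST API on the driver.
    // Assumes the driver is local and uses the default UI port 4040.
    val base = "http://localhost:4040/api/v1"

    // List applications known to this driver; each entry carries an
    // application id that can be used to drill into stage and executor
    // endpoints for per-stage run time and other metrics.
    val apps = Source.fromURL(s"$base/applications").mkString
    println(apps)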

Then use jps to identify the Spark process:

jps|grep SparkSubmit

Using that process name, start jmonitor on the OS and point it at the SparkSubmit
process. It will show CPU, memory, and heap usage plotted against time.
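
If what you are after is a single cumulative figure closer to the MR CPU-ms counter, you can also register a SparkListener on the driver and sum per-task metrics yourself. A rough sketch, not a full solution: executorRunTime is elapsed time on the executor rather than pure CPU time, and the listener name below is just for illustration.

    import java.util.concurrent.atomic.AtomicLong
    import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

    // Sums executorRunTime (elapsed ms per task) across the whole
    // application, roughly analogous to an MR job counter.
    class RunTimeListener extends SparkListener {
      val totalRunTimeMs = new AtomicLong(0L)

      override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
        val metrics = taskEnd.taskMetrics
        if (metrics != null) {
          totalRunTimeMs.addAndGet(metrics.executorRunTime)
        }
      }
    }

    // On the driver:
    //   val listener = new RunTimeListener
    //   sc.addSparkListener(listener)
    //   ... run your jobs ...
    //   println(s"Total executor run time: ${listener.totalRunTimeMs.get} ms")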


HTH



Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 20 May 2016 at 04:35, Wang Jiaye <ru...@gmail.com> wrote:

> For MR jobs, there are job counters that report CPU time in milliseconds, but I
> cannot find similar metrics in Spark, which would be quite useful. Does anyone
> know about this?
>