Posted to user@spark.apache.org by Amit Sela <am...@gmail.com> on 2016/10/27 14:07:31 UTC

Many Spark metric names do not include the application name

Hi guys,


It seems that JvmSource, DAGSchedulerSource, BlockManagerSource,
ExecutorAllocationManager, and other metric sources (except for
StreamingSource) publish their metrics directly under the "driver"
segment (or its executor counterpart) of the metric path, without
including the application name.


For instance:

   - "spark.application_xxxx.driver.DAGScheduler.job.allJobs"

   while I would expect it to be something like:

   - "spark.application_xxxx.driver.myAppName.DAGScheduler.job.allJobs"

   just like it currently is in the streaming metrics (StreamingSource):

   - "spark.application_xxxx.driver.myAppName.StreamingMetrics.streaming.lastCompletedBatch_processingDelay"
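As far as I can tell, the difference comes from how each source defines its sourceName: the metrics system prefixes every source with "<appId>.<executorId>.", and StreamingSource happens to embed the app name inside its own sourceName. A rough sketch of the two naming patterns (my assumption of the behavior, not verbatim Spark code; registryName is a hypothetical helper for illustration):

```java
public class MetricNaming {
    // Hypothetical stand-in for how the metrics system builds the
    // registry path: "<appId>.<executorId>.<sourceName>"
    static String registryName(String appId, String executorId, String sourceName) {
        return appId + "." + executorId + "." + sourceName;
    }

    public static void main(String[] args) {
        // Most sources use a fixed sourceName, e.g. "DAGScheduler":
        String dag = registryName("application_xxxx", "driver", "DAGScheduler");

        // StreamingSource embeds the application name in its sourceName:
        String streaming = registryName("application_xxxx", "driver",
                "myAppName.StreamingMetrics");

        System.out.println(dag);       // application_xxxx.driver.DAGScheduler
        System.out.println(streaming); // application_xxxx.driver.myAppName.StreamingMetrics
    }
}
```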

I was wondering whether there is a reason for not including the
application name in the metric path?
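In case it helps reproduce: the names above can be observed by enabling a console sink in conf/metrics.properties, roughly like this (a sketch; the period/unit values are just illustrative):

    *.sink.console.class=org.apache.spark.metrics.sink.ConsoleSink
    *.sink.console.period=10
    *.sink.console.unit=seconds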


Your help would be much appreciated!


Regards,

Amit