Posted to user@spark.apache.org by Steve Lewis <lo...@gmail.com> on 2014/12/04 02:05:46 UTC

How can a function running on a slave access the Executor

 I have been working on balancing work across a number of partitions, and
it would be useful to access information about the current execution
environment. Much of that information (like the executor ID) would be
available if there were a way to get the current Executor or the Hadoop
TaskAttempt context. Does anyone on the list know how to access such an
object from a function running on a slave? Currently I am reduced to
tracking the MAC address just to know which machine the code is running
on, but there must be a better way.
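
For reference, the MAC-address workaround described above can be sketched with nothing but the standard library (this is a minimal sketch, not code from the original message; the helper name `worker_identity` is made up for illustration):

```python
import uuid

def worker_identity():
    # uuid.getnode() returns the host's hardware (MAC) address as a
    # 48-bit integer (or a random 48-bit value if no MAC is available).
    # Formatting it as colon-separated hex bytes gives a per-machine tag,
    # which is the kind of "which machine am I on" marker described above.
    mac = uuid.getnode()
    return ":".join(f"{(mac >> shift) & 0xff:02x}" for shift in range(40, -8, -8))
```

As an aside, a function like this only identifies the machine, not the executor: several executors on one host would report the same tag. Spark itself later exposed this information directly (TaskContext.get() from Spark 1.2, and SparkEnv.get.executorId), though whether those apply depends on the Spark version in use.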