Posted to mapreduce-dev@hadoop.apache.org by "wei (JIRA)" <ji...@apache.org> on 2017/09/21 03:08:00 UTC
[jira] [Created] (MAPREDUCE-6963) MR Map or Reduce specified node label is missing due to concurrent task limits
wei created MAPREDUCE-6963:
------------------------------
Summary: MR Map or Reduce specified node label is missing due to concurrent task limits
Key: MAPREDUCE-6963
URL: https://issues.apache.org/jira/browse/MAPREDUCE-6963
Project: Hadoop Map/Reduce
Issue Type: Bug
Components: mr-am
Affects Versions: 2.8.1
Reporter: wei
In RMContainerAllocator#applyConcurrentTaskLimits, we limit the degree of task parallelism, but we do not take into account the node label specified for the ResourceRequest.
{code:java}
  private void applyConcurrentTaskLimits() {
    .......
      setRequestLimit(PRIORITY_FAST_FAIL_MAP, mapResourceRequest,
          failedMapRequestLimit);
      setRequestLimit(PRIORITY_MAP, mapResourceRequest, normalMapRequestLimit);
    }
    .......
      setRequestLimit(PRIORITY_REDUCE, reduceResourceRequest,
          reduceRequestLimit);
    }
  }
{code}
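The failure mode can be modeled without Hadoop on the classpath. The sketch below is a hypothetical, simplified stand-in for YARN's ResourceRequest (the real class is org.apache.hadoop.yarn.api.records.ResourceRequest); makeLimit mirrors what setRequestLimit is described as doing, copying only priority, capability, and the capped container count, so the node-label expression is never carried over:

```java
// Self-contained model of the suspected root cause: the limit request
// is built from priority, capability and container count only, so any
// node-label expression on the original request is lost.
public class RequestLimitSketch {
    // Hypothetical, simplified stand-in for YARN's ResourceRequest.
    static class ResourceRequest {
        int priority;
        String capability;            // e.g. "<memory:1024, vCores:1>"
        int numContainers;
        String nodeLabelExpression;   // null means "no label requested"
    }

    // Mirrors the described behavior of setRequestLimit: build a fresh
    // request that caps numContainers, without copying the label.
    static ResourceRequest makeLimit(ResourceRequest original, int limit) {
        ResourceRequest reqLimit = new ResourceRequest();
        reqLimit.priority = original.priority;
        reqLimit.capability = original.capability;
        reqLimit.numContainers = limit;
        // nodeLabelExpression is never copied -- the label disappears.
        return reqLimit;
    }

    public static void main(String[] args) {
        ResourceRequest map = new ResourceRequest();
        map.priority = 20;
        map.capability = "<memory:1024, vCores:1>";
        map.numContainers = 500;
        map.nodeLabelExpression = "gpu";   // label requested by the job

        ResourceRequest limit = makeLimit(map, 100);
        System.out.println("limit containers = " + limit.numContainers);
        System.out.println("limit label      = " + limit.nodeLabelExpression);
        // prints: limit containers = 100
        //         limit label      = null
    }
}
```

Once this capped request replaces the original ask (see applyRequestLimits below), the scheduler no longer sees the "gpu" label at all.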
Then applyRequestLimits is called in RMContainerAllocator#makeRemoteRequest to apply the request limits. When the condition req.getNumContainers() > limit is met, the original ResourceRequest (ask) is replaced by the reqLimit generated above.
{code:java}
  private void applyRequestLimits() {
    .....
      // update an existing ask or send a new one if updating
      if (ask.remove(req) || requestLimitsToUpdate.contains(req)) {
        ResourceRequest newReq = req.getNumContainers() > limit
            ? reqLimit : req;
        ask.add(newReq);
        LOG.info("Applying ask limit of " + newReq.getNumContainers()
            + " for priority:" + reqLimit.getPriority()
            + " and capability:" + reqLimit.getCapability());
      }
    .....
    }
  }
{code}
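One possible direction for a fix, sketched against the same hypothetical simplified stand-in as above (not the actual Hadoop patch): carry the node-label expression of the original ask over to the limit request, so that replacing the ask no longer drops the label.

```java
// Sketch of a candidate fix: preserve nodeLabelExpression when building
// the capped request. Class and field names are illustrative stand-ins
// for YARN's real ResourceRequest.
public class RequestLimitFixSketch {
    static class ResourceRequest {
        int priority;
        String capability;
        int numContainers;
        String nodeLabelExpression;
    }

    static ResourceRequest makeLimit(ResourceRequest original, int limit) {
        ResourceRequest reqLimit = new ResourceRequest();
        reqLimit.priority = original.priority;
        reqLimit.capability = original.capability;
        reqLimit.numContainers = limit;
        // Fix: copy the label so the capped request still targets the
        // nodes the job asked for.
        reqLimit.nodeLabelExpression = original.nodeLabelExpression;
        return reqLimit;
    }

    public static void main(String[] args) {
        ResourceRequest reduce = new ResourceRequest();
        reduce.priority = 10;
        reduce.capability = "<memory:2048, vCores:1>";
        reduce.numContainers = 300;
        reduce.nodeLabelExpression = "ssd";

        ResourceRequest limit = makeLimit(reduce, 50);
        System.out.println(limit.numContainers + " / "
            + limit.nodeLabelExpression);
        // prints: 50 / ssd
    }
}
```

With the label preserved, swapping the ask for reqLimit in applyRequestLimits caps concurrency without silently changing which labeled nodes the containers are requested on.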
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
---------------------------------------------------------------------
To unsubscribe, e-mail: mapreduce-dev-unsubscribe@hadoop.apache.org
For additional commands, e-mail: mapreduce-dev-help@hadoop.apache.org