Posted to issues@spark.apache.org by "Hyukjin Kwon (JIRA)" <ji...@apache.org> on 2018/08/24 02:09:00 UTC
[jira] [Resolved] (SPARK-25210) spark driver apply task success info cost much time
[ https://issues.apache.org/jira/browse/SPARK-25210?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-25210.
----------------------------------
Resolution: Invalid
Let's reopen when it's clear if that's an issue.
> spark driver apply task success info cost much time
> ----------------------------------------------------
>
> Key: SPARK-25210
> URL: https://issues.apache.org/jira/browse/SPARK-25210
> Project: Spark
> Issue Type: Bug
> Components: Scheduler
> Affects Versions: 2.1.0
> Reporter: wangminfeng
> Priority: Major
>
> When I run a Spark job in standalone mode, I have 600 instances, and every instance can run 50 tasks.
> The driver sends a message to an executor, and the executor immediately receives the message and runs the task.
> But when the executor finishes the task and sends a message back to the driver, the driver waits a long time before processing that message, like this:
> The driver launches the task:
> 18/08/23 20:20:11 DEBUG scheduler.cluster.YarnSchedulerBackend$YarnDriverEndpoint [dispatcher-event-loop-37]: Launching task 345733 on executor id: 261 hostname: yq01-inf-wangzeyu081.yq01.baidu.com
> The executor starts the task:
> 18/08/23 20:20:11 INFO CoarseGrainedExecutorBackend: Got assigned task 345733
> There is no delay here.
>
> The executor finishes the task:
> 18/08/23 20:20:17 INFO Executor: Finished task 79.0 in stage 96.0 (TID 345733). 1573 bytes result sent to driver
> 18/08/23 20:20:23 INFO spark.scheduler.TaskSetManager [task-result-getter-19]: Finished task 79.0 in stage 96.0 (TID 345733) in 15243 ms on yq01-inf-wangzeyu081.yq01.baidu.com (executor 261) (33095/62500)
> There is a 6 s delay between these two log lines.
>
> I am sure this is not a network problem.
>
> So, how can this delay be reduced?
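(Editorial note, not part of the original ticket: the gap described above, between the executor's "result sent to driver" line and the TaskSetManager's "Finished task" line, is time the serialized result spends queued in the driver's TaskResultGetter thread pool before it is deserialized and handed to the scheduler. One possible mitigation, assuming pool saturation is the cause, is raising `spark.resultGetter.threads` from its default of 4. The class name and values below are illustrative only.)

```shell
# Sketch under the assumption that driver-side result deserialization is the
# bottleneck: give the TaskResultGetter pool more threads so finished-task
# results queue for less time. Numbers are illustrative, not a recommendation;
# com.example.MyJob and my-job.jar are hypothetical placeholders.
spark-submit \
  --conf spark.resultGetter.threads=16 \
  --conf spark.driver.cores=8 \
  --class com.example.MyJob \
  my-job.jar
```

With ~600 executors finishing many small tasks at once, 4 result-getter threads can fall behind even when each individual result is tiny, which matches the symptom of a multi-second lag with no network problem.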
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org