Posted to issues@spark.apache.org by "luzengxiang (JIRA)" <ji...@apache.org> on 2019/02/15 10:18:00 UTC

[jira] [Commented] (SPARK-26886) Proper termination of external processes launched by the worker

    [ https://issues.apache.org/jira/browse/SPARK-26886?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16769165#comment-16769165 ] 

luzengxiang commented on SPARK-26886:
-------------------------------------

[~mengxr] Let's discuss this.

> Proper termination of external processes launched by the worker
> ---------------------------------------------------------------
>
>                 Key: SPARK-26886
>                 URL: https://issues.apache.org/jira/browse/SPARK-26886
>             Project: Spark
>          Issue Type: New JIRA Project
>          Components: Spark Core
>    Affects Versions: 2.4.0
>            Reporter: luzengxiang
>            Priority: Minor
>
> When embedding a deep learning framework in Spark, the Spark worker sometimes has to launch an external process (e.g. an MPI task).
> {code}
> val nothing = inputData.barrier().mapPartitions { iter =>
>   val barrierTask = BarrierTaskContext.get()
>   // save data to disk
>   barrierTask.barrier()
>   // launch external process, e.g. MPI task + TensorFlow
>   iter
> }
> {code}
>  
> The problem is that the external process keeps running when the Spark task is killed manually. This JIRA is the place to discuss how to properly terminate external processes launched by the Spark worker when a Spark task is killed or interrupted.
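
One possible direction, as a minimal sketch assuming Spark 2.4's listener API: register a task completion listener on the BarrierTaskContext that destroys the child process when the task finishes or is killed. The mpirun command line and the inputData RDD below are hypothetical placeholders, not part of the original report.

{code}
import org.apache.spark.BarrierTaskContext

val nothing = inputData.barrier().mapPartitions { iter =>
  val barrierTask = BarrierTaskContext.get()
  barrierTask.barrier()

  // Launch the external process (hypothetical command line).
  val proc = new ProcessBuilder("mpirun", "-n", "1", "./train.sh").start()

  // Destroy the child when the task completes or is interrupted;
  // completion listeners fire on both success and task kill.
  barrierTask.addTaskCompletionListener[Unit] { _ =>
    if (proc.isAlive) proc.destroyForcibly()
  }

  proc.waitFor()
  iter
}
{code}

Note this only covers termination driven from inside the task's JVM: if the executor process itself dies abruptly (e.g. SIGKILL), the listener never runs, which is part of what this issue needs to address.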



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org