Posted to issues@spark.apache.org by "luzengxiang (JIRA)" <ji...@apache.org> on 2019/02/15 07:38:00 UTC

[jira] [Updated] (SPARK-26886) Proper termination of external processes launched by the worker

     [ https://issues.apache.org/jira/browse/SPARK-26886?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

luzengxiang updated SPARK-26886:
--------------------------------
    Description: 
When embedding a deep learning framework in Spark, the Spark worker has to launch external processes (e.g. MPI tasks) in some cases.

val nothing = inputData.barrier().mapPartitions { iter =>
  val barrierTask = BarrierTaskContext.get()
  // save the partition's data to disk
  barrierTask.barrier()
  // launch the external process, e.g. an MPI task running TensorFlow
  iter
}
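
As a concrete illustration (not part of the original report), the launch step might use java.lang.ProcessBuilder; the mpirun command line below is a placeholder, not an API from this issue:

    // Hypothetical launch of the MPI job from inside the barrier task;
    // the command and its arguments are placeholders.
    val pb = new ProcessBuilder("mpirun", "-np", "4", "python", "train.py")
    pb.inheritIO()           // forward the child's stdout/stderr to the worker
    val process = pb.start() // java.lang.Process handle, kept for later cleanup
    process.waitFor()        // block until the MPI job exits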


This JIRA is about properly terminating external processes launched by the Spark worker when the Spark task is killed or interrupted.
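
One possible shape of the fix, sketched here assuming "process" is the java.lang.Process handle from the launch sketch above: register task listeners that destroy the child process when the task completes or fails. addTaskCompletionListener and addTaskFailureListener are existing TaskContext methods in Spark 2.4; whether they run reliably when a task is hard-killed is exactly what this issue questions.

    import org.apache.spark.TaskContext

    // Best-effort cleanup: destroy the external process when the task
    // completes (including being killed) or fails with an exception.
    val ctx = TaskContext.get()
    ctx.addTaskCompletionListener[Unit] { _ =>
      if (process.isAlive) process.destroyForcibly()
    }
    ctx.addTaskFailureListener { (_: TaskContext, _: Throwable) =>
      if (process.isAlive) process.destroyForcibly()
    }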

  was:
When embedding a deep learning framework in Spark, the Spark worker has to launch external processes (e.g. MPI tasks) in some cases.

{quote}
val nothing = inputData.barrier().mapPartitions { iter =>
  val barrierTask = BarrierTaskContext.get()
  // save the partition's data to disk
  barrierTask.barrier()
  // launch the external process, e.g. an MPI task running TensorFlow
  iter
}
{quote}

This JIRA is about properly terminating external processes launched by the Spark worker when the Spark task is killed or interrupted.


> Proper termination of external processes launched by the worker
> ---------------------------------------------------------------
>
>                 Key: SPARK-26886
>                 URL: https://issues.apache.org/jira/browse/SPARK-26886
>             Project: Spark
>          Issue Type: New JIRA Project
>          Components: Spark Core
>    Affects Versions: 2.4.0
>            Reporter: luzengxiang
>            Priority: Minor
>
> When embedding a deep learning framework in Spark, the Spark worker has to launch external processes (e.g. MPI tasks) in some cases.
> val nothing = inputData.barrier().mapPartitions { iter =>
>   val barrierTask = BarrierTaskContext.get()
>   // save the partition's data to disk
>   barrierTask.barrier()
>   // launch the external process, e.g. an MPI task running TensorFlow
>   iter
> }
> This JIRA is about properly terminating external processes launched by the Spark worker when the Spark task is killed or interrupted.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org