Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2019/10/08 05:44:17 UTC

[jira] [Resolved] (SPARK-24405) parameter for python worker timeout

     [ https://issues.apache.org/jira/browse/SPARK-24405?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon resolved SPARK-24405.
----------------------------------
    Resolution: Incomplete

> parameter for python worker timeout
> -----------------------------------
>
>                 Key: SPARK-24405
>                 URL: https://issues.apache.org/jira/browse/SPARK-24405
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 2.3.0
>            Reporter: Deepansh
>            Priority: Minor
>              Labels: bulk-closed
>
> Currently, the Python worker idle timeout is hard-coded to 60000 ms and cannot be changed. I am developing a pure Python application, and workers die after one minute of inactivity; any new activity after that spawns fresh Python workers and re-initializes their environment. Can we make this parameter configurable, with the default kept at one minute, so that pure Python applications run faster?
> parameter = "IDLE_WORKER_TIMEOUT_MS"
> file = [PythonWorkerFactory.scala|https://github.com/apache/spark/blob/628c7b517969c4a7ccb26ea67ab3dd61266073ca/core/src/main/scala/org/apache/spark/api/python/PythonWorkerFactory.scala]
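To make the reported behavior concrete, here is a toy Python sketch of an idle-worker pool with a configurable timeout. This is an illustration only, not Spark's actual implementation (the real logic lives in PythonWorkerFactory.scala, and the class, method, and parameter names below are invented for the example): a worker that sits idle past the timeout is reaped, and the next request pays the cost of spawning and re-initializing a new one.

```python
import time


class WorkerPool:
    """Toy model of an idle-worker pool (illustrative only, not Spark's code)."""

    def __init__(self, idle_timeout_s=60.0):
        # Hypothetical knob; the issue asks for something like this instead of
        # the hard-coded one-minute timeout.
        self.idle_timeout_s = idle_timeout_s
        self.worker = None
        self.spawn_count = 0  # counts how often the (expensive) setup runs

    def _spawn(self, now):
        # Stands in for forking a Python worker and setting up its environment.
        self.spawn_count += 1
        return {"last_used": now}

    def acquire(self, now=None):
        if now is None:
            now = time.monotonic()
        # Reap the worker if it has sat idle longer than the timeout,
        # mimicking the monitor that kills idle Python workers.
        if self.worker and now - self.worker["last_used"] > self.idle_timeout_s:
            self.worker = None
        if self.worker is None:
            self.worker = self._spawn(now)
        self.worker["last_used"] = now
        return self.worker
```

With the default 60 s timeout, two requests 90 s apart trigger two spawns; raising the timeout to 120 s lets the second request reuse the existing worker, which is the speedup the reporter is after for pure Python workloads.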



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org