Posted to issues@spark.apache.org by "Felix Cheung (JIRA)" <ji...@apache.org> on 2018/06/05 01:48:00 UTC

[jira] [Resolved] (SPARK-24403) Reuse R worker

     [ https://issues.apache.org/jira/browse/SPARK-24403?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Felix Cheung resolved SPARK-24403.
----------------------------------
    Resolution: Duplicate

> Reuse R worker
> --------------
>
>                 Key: SPARK-24403
>                 URL: https://issues.apache.org/jira/browse/SPARK-24403
>             Project: Spark
>          Issue Type: Improvement
>          Components: SparkR
>    Affects Versions: 2.3.0
>            Reporter: Deepansh
>            Priority: Major
>              Labels: sparkR
>
> Currently, SparkR does not reuse its workers, so broadcast variables and closures are transferred to the workers for every task (sketched below). Can the Python worker reuse mechanism be brought to SparkR as well, to improve its performance?
> Performance issue reference: [https://issues.apache.org/jira/browse/SPARK-23650]
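>
> A minimal sketch of the pattern that pays this cost today (the names, such as `lookup`, and the local session settings are illustrative, not part of the issue). The large `lookup` object is captured by the closure, and because the R workers are not reused, it is serialized and shipped to the workers again for every task:
>
>     library(SparkR)
>     sparkR.session(master = "local[2]")
>
>     # A large local object captured by the closure below.
>     lookup <- rnorm(1e6)
>
>     df <- createDataFrame(data.frame(id = 1:100))
>
>     # SparkR does not reuse its workers, so `lookup` and the closure
>     # are serialized and transferred again for each task.
>     result <- dapply(df,
>                      function(part) {
>                        part$score <- part$id / length(lookup)
>                        part
>                      },
>                      schema = structType(structField("id", "integer"),
>                                          structField("score", "double")))
>     head(result)
>
> For comparison, PySpark keeps its Python workers alive between tasks via the spark.python.worker.reuse configuration (enabled by default); the proposal is to add an analogous reuse mechanism for the SparkR worker.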



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org