Posted to issues@spark.apache.org by "Nick Pentreath (JIRA)" <ji...@apache.org> on 2017/02/24 08:22:44 UTC

[jira] [Closed] (SPARK-10041) Proposal of Parameter Server Interface for Spark

     [ https://issues.apache.org/jira/browse/SPARK-10041?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nick Pentreath closed SPARK-10041.
----------------------------------
    Resolution: Won't Fix

> Proposal of Parameter Server Interface for Spark
> ------------------------------------------------
>
>                 Key: SPARK-10041
>                 URL: https://issues.apache.org/jira/browse/SPARK-10041
>             Project: Spark
>          Issue Type: New Feature
>          Components: ML, MLlib
>            Reporter: Yi Liu
>         Attachments: Proposal of Parameter Server Interface for Spark - v1.pdf
>
>
> Many large-scale machine learning algorithms (logistic regression, LDA, neural networks, etc.) have been built on top of Apache Spark. As discussed in SPARK-4590, a Parameter Server (PS) architecture can greatly improve the scalability and efficiency of these large-scale machine learning workloads. There have been previous discussions of possible Parameter Server implementations inside Spark (e.g., SPARK-6932). However, at this stage we believe it is more important for the community to first define a proper Parameter Server interface that is decoupled from the actual PS implementations, so that different Parameter Server implementations can be supported in Spark later. The attached document contains our initial proposal of a Parameter Server interface for ML algorithms on Spark, including the data model, supported operations, epoch support and possible Spark integrations.
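
As a rough illustration of what such a decoupled interface could look like, below is a minimal Scala sketch of a client-side Parameter Server contract. All names and signatures here (ParameterServerClient, get, update, advanceEpoch, LocalPSClient) are illustrative assumptions and are not taken from the attached proposal document or from any existing Spark API.

// Hypothetical, illustrative sketch only; not part of Spark or of SPARK-10041.
import scala.collection.mutable

// Client-side view of the parameter store, independent of the concrete
// PS backend (local, distributed, external service, ...).
trait ParameterServerClient[K, V] {
  def get(keys: Seq[K]): Map[K, V]      // pull current parameter values
  def update(deltas: Map[K, V]): Unit   // push updates (e.g. gradient deltas)
  def advanceEpoch(): Long              // mark the end of a training epoch
}

// Trivial single-process implementation, only to make the contract concrete.
class LocalPSClient(merge: (Double, Double) => Double)
    extends ParameterServerClient[String, Double] {
  private val store = mutable.Map.empty[String, Double].withDefaultValue(0.0)
  private var epoch = 0L

  override def get(keys: Seq[String]): Map[String, Double] =
    keys.map(k => k -> store(k)).toMap

  override def update(deltas: Map[String, Double]): Unit = synchronized {
    deltas.foreach { case (k, d) => store(k) = merge(store(k), d) }
  }

  override def advanceEpoch(): Long = synchronized { epoch += 1; epoch }
}

object PSExample extends App {
  val ps = new LocalPSClient(_ + _)            // additive merge of gradients
  ps.update(Map("w0" -> 0.1, "w1" -> -0.05))   // a worker pushes deltas
  println(ps.get(Seq("w0", "w1")))             // a worker pulls latest values
  println(s"epoch = ${ps.advanceEpoch()}")
}

In a design along these lines, ML algorithms would program against the trait only, so a distributed Parameter Server backend could be plugged in later without changing algorithm code; that is the decoupling the proposal argues for.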



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
