Posted to issues@spark.apache.org by "Xiangrui Meng (JIRA)" <ji...@apache.org> on 2015/05/07 01:16:59 UTC

[jira] [Resolved] (SPARK-5995) Make ML Prediction Developer APIs public

     [ https://issues.apache.org/jira/browse/SPARK-5995?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Xiangrui Meng resolved SPARK-5995.
----------------------------------
       Resolution: Fixed
    Fix Version/s: 1.4.0

Issue resolved by pull request 5913
[https://github.com/apache/spark/pull/5913]

> Make ML Prediction Developer APIs public
> ----------------------------------------
>
>                 Key: SPARK-5995
>                 URL: https://issues.apache.org/jira/browse/SPARK-5995
>             Project: Spark
>          Issue Type: Sub-task
>          Components: ML
>    Affects Versions: 1.3.0
>            Reporter: Joseph K. Bradley
>            Assignee: Joseph K. Bradley
>             Fix For: 1.4.0
>
>
> Previously, some Developer APIs were added to spark.ml for classification and regression to make it easier to add new algorithms and models ([SPARK-4789]). There are ongoing discussions about the best design of the API. This JIRA is to continue that discussion and try to finalize those Developer APIs so that they can be made public.
> Please see [this design doc from SPARK-4789 | https://docs.google.com/document/d/1BH9el33kBX8JiDdgUJXdLW14CA2qhTCWIG46eXZVoJs] for details on the original API design.
> Some issues under debate:
> * Should there be strongly typed APIs for fit()?
> ** Proposal: No
> * Should the strongly typed API for transform() be public (vs. protected)?
> ** Proposal: Protected for now
> * What transformation methods should the API make developers implement for classification?
> ** Proposal: See design doc; a rough sketch of the resulting developer code appears after this list
> * Should there be a way to transform a single Row (instead of only DataFrames)?
> ** Proposal: Not for now
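> For concreteness, here is a rough sketch of what a third-party algorithm could look like against the proposed abstractions. The Predictor/PredictionModel names come from the SPARK-4789 design doc; the MeanRegressor example and the exact signatures are illustrative only, not the final public API:
> {code:scala}
> import org.apache.spark.ml.{PredictionModel, Predictor}
> import org.apache.spark.ml.param.ParamMap
> import org.apache.spark.mllib.linalg.Vector
> import org.apache.spark.sql.DataFrame
>
> // Toy regressor that always predicts the mean of the label column.
> // It shows the strongly typed train()/predict() hooks a developer would
> // implement, while fit()/transform() stay in the abstract classes.
> class MeanRegressor(override val uid: String)
>   extends Predictor[Vector, MeanRegressor, MeanRegressionModel] {
>
>   def this() = this("meanRegressor")
>
>   // Protected, strongly typed training hook; the public fit() is expected
>   // to validate the schema and copy params before calling this.
>   override protected def train(dataset: DataFrame): MeanRegressionModel = {
>     val mean = dataset.select($(labelCol)).rdd.map(_.getDouble(0)).mean()
>     new MeanRegressionModel(uid, mean)
>   }
>
>   override def copy(extra: ParamMap): MeanRegressor =
>     copyValues(new MeanRegressor(uid), extra)
> }
>
> class MeanRegressionModel(override val uid: String, val mean: Double)
>   extends PredictionModel[Vector, MeanRegressionModel] {
>
>   // Single-Row prediction hook; transform() maps it over the features column.
>   override protected def predict(features: Vector): Double = mean
>
>   override def copy(extra: ParamMap): MeanRegressionModel =
>     copyValues(new MeanRegressionModel(uid, mean), extra)
> }
> {code}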



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org