Posted to issues@spark.apache.org by "Joseph K. Bradley (JIRA)" <ji...@apache.org> on 2016/04/29 00:14:13 UTC

[jira] [Updated] (SPARK-14815) ML, Graph, R 2.0 QA: Update user guide for new APIs

     [ https://issues.apache.org/jira/browse/SPARK-14815?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Joseph K. Bradley updated SPARK-14815:
--------------------------------------
    Component/s: SparkR
                 GraphX

> ML, Graph, R 2.0 QA: Update user guide for new APIs
> ---------------------------------------------------
>
>                 Key: SPARK-14815
>                 URL: https://issues.apache.org/jira/browse/SPARK-14815
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Documentation, GraphX, ML, MLlib, SparkR
>            Reporter: Joseph K. Bradley
>
> Check the user guide against a list of new APIs (classes, methods, data members) to identify items that require user guide updates.
> For each feature missing user guide doc:
> * Create a JIRA for documenting that feature, and assign it to the feature's author.
> * Link it (a) to the original JIRA which introduced the feature ("related to") and (b) to this JIRA ("requires").
> Note: Now that we have algorithms in spark.ml that are not in spark.mllib, we should add subsections for the spark.ml API as needed. We can follow the structure of the spark.mllib user guide.
> * The spark.ml user guide can provide: (a) code examples and (b) info on algorithms which do not exist in spark.mllib.
> * Since spark.ml is becoming the primary API in 2.0, we should copy algorithm details from the spark.mllib guide into the spark.ml guide as needed, rather than just linking back to the corresponding algorithms in the spark.mllib user guide.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org