Posted to issues@spark.apache.org by "Joseph K. Bradley (JIRA)" <ji...@apache.org> on 2015/12/11 23:55:46 UTC

[jira] [Commented] (SPARK-11606) ML 1.6 QA: Update user guide for new APIs

    [ https://issues.apache.org/jira/browse/SPARK-11606?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15053734#comment-15053734 ] 

Joseph K. Bradley commented on SPARK-11606:
-------------------------------------------

I'm going to split off the remaining guide sections into a new umbrella JIRA so that I can close this one.

> ML 1.6 QA: Update user guide for new APIs
> -----------------------------------------
>
>                 Key: SPARK-11606
>                 URL: https://issues.apache.org/jira/browse/SPARK-11606
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Documentation, ML, MLlib
>            Reporter: Joseph K. Bradley
>            Assignee: Joseph K. Bradley
>
> Check the user guide vs. a list of new APIs (classes, methods, data members) to see what items require updates to the user guide.
> For each feature missing user guide doc:
> * Create a JIRA for that feature, and assign it to the author of the feature
> * Link it to (a) the original JIRA which introduced that feature ("related to") and (b) this JIRA ("requires").
> Note: Now that we have algorithms in spark.ml which are not in spark.mllib, we should make subsections for the spark.ml API as needed. We can follow the structure of the spark.mllib user guide.
> * The spark.ml user guide can provide: (a) code examples (a sketch of the kind of snippet intended follows below) and (b) info on algorithms which do not exist in spark.mllib.
> * We should not duplicate info between the spark.ml and spark.mllib guides. Since spark.mllib is still the primary API, the spark.ml sections should link to the corresponding algorithms in the spark.mllib user guide for details.
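
For reference, here is a minimal sketch of the kind of spark.ml code example such a guide section would carry: fitting a Pipeline (Tokenizer -> HashingTF -> LogisticRegression) on a toy DataFrame. It assumes an existing SQLContext named sqlContext, and the column names, data, and parameter values are illustrative only, not part of this issue.

    import org.apache.spark.ml.Pipeline
    import org.apache.spark.ml.classification.LogisticRegression
    import org.apache.spark.ml.feature.{HashingTF, Tokenizer}

    // Toy training data; assumes an existing SQLContext named sqlContext.
    val training = sqlContext.createDataFrame(Seq(
      (0L, "a b c d e spark", 1.0),
      (1L, "b d", 0.0),
      (2L, "spark f g h", 1.0),
      (3L, "hadoop mapreduce", 0.0)
    )).toDF("id", "text", "label")

    // Chain feature extraction and a classifier into a single Pipeline.
    val tokenizer = new Tokenizer().setInputCol("text").setOutputCol("words")
    val hashingTF = new HashingTF().setInputCol("words").setOutputCol("features")
    val lr = new LogisticRegression().setMaxIter(10).setRegParam(0.01)
    val pipeline = new Pipeline().setStages(Array(tokenizer, hashingTF, lr))

    // Fit the whole pipeline, then reuse the fitted model to score data.
    val model = pipeline.fit(training)
    model.transform(training).select("id", "text", "probability", "prediction").show()

Each such example would sit under its algorithm's subsection, mirroring the layout of the spark.mllib user guide.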



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
