Posted to issues@spark.apache.org by "Nick Pentreath (JIRA)" <ji...@apache.org> on 2016/12/08 07:27:58 UTC
[jira] [Updated] (SPARK-18320) ML 2.1 QA: API: Python API coverage
[ https://issues.apache.org/jira/browse/SPARK-18320?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Nick Pentreath updated SPARK-18320:
-----------------------------------
    Fix Version/s:     (was: 2.1.1)
                       (was: 2.2.0)
                       2.1.0
> ML 2.1 QA: API: Python API coverage
> -----------------------------------
>
> Key: SPARK-18320
> URL: https://issues.apache.org/jira/browse/SPARK-18320
> Project: Spark
> Issue Type: Sub-task
> Components: Documentation, ML, PySpark
> Reporter: Joseph K. Bradley
> Assignee: Seth Hendrickson
> Priority: Blocker
> Fix For: 2.1.0
>
>
> For new public APIs added to MLlib ({{spark.ml}} only), we need to check the generated HTML doc and compare the Scala & Python versions.
> * *GOAL*: Audit the API and create JIRAs for any issues to fix in the next release.
> * *NON-GOAL*: This JIRA is _not_ for fixing the API parity issues.
> We need to track:
> * Inconsistency: Do class/method/parameter names match? (One quick way to list the Python-side params is sketched after this list.)
> * Docs: Is the Python doc missing or just a stub? We want the Python doc to be as complete as the Scala doc.
> * API breaking changes: These should be very rare but are occasionally either necessary (intentional) or accidental. These must be recorded and added to the Migration Guide for this release.
> ** Note: If the API change is for an Alpha/Experimental/DeveloperApi component, please note that as well.
> * Missing classes/methods/parameters: We should create to-do JIRAs for functionality missing from Python, to be added in the next release cycle. *Please use a _separate_ JIRA (linked below as "requires") for this list of to-do items.*
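>
> As a rough illustration (not part of the original checklist), the sketch below shows one way to dump the Params a Python wrapper exposes so they can be compared by hand against the Scala API docs. The choice of {{LogisticRegression}}, the local master, and the app name are illustrative assumptions only.
> {code:python}
> # Illustrative sketch only: list every Param (name + doc) exposed by a
> # pyspark.ml wrapper class, for manual comparison with the Scala API docs.
> # Assumes a local Spark installation with pyspark importable.
> from pyspark.sql import SparkSession
> from pyspark.ml.classification import LogisticRegression  # example class; any spark.ml class works
>
> spark = SparkSession.builder.master("local[1]").appName("api-audit").getOrCreate()
>
> lr = LogisticRegression()
> for p in lr.params:                     # all Params defined on the Python side
>     print("%s: %s" % (p.name, p.doc))   # compare names and docs against the Scala version
>
> spark.stop()
> {code}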
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org