Posted to issues@spark.apache.org by "zhengruifeng (Jira)" <ji...@apache.org> on 2020/03/20 03:11:00 UTC

[jira] [Assigned] (SPARK-30931) ML 3.0 QA: API: Python API coverage

     [ https://issues.apache.org/jira/browse/SPARK-30931?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

zhengruifeng reassigned SPARK-30931:
------------------------------------

    Assignee: Huaxin Gao

> ML 3.0 QA: API: Python API coverage
> -----------------------------------
>
>                 Key: SPARK-30931
>                 URL: https://issues.apache.org/jira/browse/SPARK-30931
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Documentation, ML, MLlib, PySpark
>    Affects Versions: 3.0.0
>            Reporter: zhengruifeng
>            Assignee: Huaxin Gao
>            Priority: Major
>
> For new public APIs added to MLlib ({{spark.ml}} only), we need to check the generated HTML docs and compare the Scala and Python versions. A rough introspection helper for listing the Python-side params is sketched after the goals below.
> * *GOAL*: Audit the new APIs and create JIRAs for any issues to fix in the next release.
>  * *NON-GOAL*: This JIRA is _not_ for fixing the API parity issues.
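>
> As a rough aid for the audit (not part of the release process itself), the Python-side Param names can be dumped with plain introspection and compared against the Scala API docs by hand. This is a minimal sketch under a few assumptions: PySpark 3.0 is installed, and {{pyspark.ml.classification}} is used only as an example module. No running SparkContext is needed, since Params are declared as class-level attributes.
> {code:python}
> import inspect
>
> import pyspark.ml.classification as mod  # repeat for each pyspark.ml submodule
> from pyspark.ml.param import Param, Params
>
> for name, cls in inspect.getmembers(mod, inspect.isclass):
>     # Skip private helpers and classes that carry no Params at all.
>     if name.startswith("_") or not issubclass(cls, Params):
>         continue
>     # Params are class attributes, so no instantiation (and hence no
>     # SparkContext) is required to enumerate them.
>     params = sorted(
>         attr for attr in dir(cls)
>         if isinstance(getattr(cls, attr, None), Param)
>     )
>     print(f"{name}: {', '.join(params)}")
> {code}
> The printed lists can then be diffed against the parameter tables in the generated Scala docs; any name appearing on only one side is a candidate JIRA.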
> We need to track:
>  * Inconsistency: Do class/method/parameter names match between the Scala and Python APIs?
>  * Docs: Is the Python doc missing or just a stub? We want the Python docs to be as complete as the Scala docs (a quick docstring-audit sketch follows this list).
>  * API breaking changes: These should be very rare but are occasionally either necessary (intentional) or accidental. These must be recorded and added to the Migration Guide for this release.
>  ** Note: If the API change is for an Alpha/Experimental/DeveloperApi component, please note that as well.
>  * Missing classes/methods/parameters: We should create to-do JIRAs for functionality missing from Python, to be added in the next release cycle. *Please use a _separate_ JIRA (linked below as "requires") for this list of to-do items.*
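>
> For the docs check, a quick pass over the docstrings can surface missing or stub entries before the manual HTML comparison. A minimal sketch, again using {{pyspark.ml.classification}} as the example module; the 40-character "stub" threshold is an arbitrary triage heuristic, not a project rule.
> {code:python}
> import inspect
>
> import pyspark.ml.classification as mod  # repeat for each pyspark.ml submodule
>
> for name, cls in inspect.getmembers(mod, inspect.isclass):
>     # Only audit classes defined in this module, not re-exported ones.
>     if name.startswith("_") or cls.__module__ != mod.__name__:
>         continue
>     doc = inspect.getdoc(cls) or ""
>     if len(doc) < 40:
>         print(f"{name}: missing or stub class docstring")
>     for mname, meth in inspect.getmembers(cls, inspect.isfunction):
>         if mname.startswith("_"):
>             continue
>         if not (inspect.getdoc(meth) or "").strip():
>             print(f"{name}.{mname}: missing docstring")
> {code}
> Anything flagged here still needs a human look against the Scala doc; inherited docstrings and one-line setters are often fine as-is.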


