Posted to issues@spark.apache.org by "Joseph K. Bradley (JIRA)" <ji...@apache.org> on 2016/11/07 19:15:59 UTC

[jira] [Commented] (SPARK-18316) Spark MLlib, GraphX 2.1 QA umbrella

    [ https://issues.apache.org/jira/browse/SPARK-18316?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15645121#comment-15645121 ] 

Joseph K. Bradley commented on SPARK-18316:
-------------------------------------------

Time to start another round of QA!  I'm working on cloning tasks from 2.0, but with a few improvements.  One big change: I'm splitting the SparkR tasks off from MLlib/GraphX since they are fairly different.

> Spark MLlib, GraphX 2.1 QA umbrella
> -----------------------------------
>
>                 Key: SPARK-18316
>                 URL: https://issues.apache.org/jira/browse/SPARK-18316
>             Project: Spark
>          Issue Type: Umbrella
>          Components: Documentation, GraphX, ML, MLlib
>            Reporter: Joseph K. Bradley
>            Assignee: Joseph K. Bradley
>            Priority: Critical
>
> h1. WIP - still being updated!
> This JIRA lists tasks for the next Spark release's QA period for MLlib, GraphX, and SparkR.
> The list below gives an overview of what is involved; the corresponding JIRA issues are linked below it.
> h2. API
> * Check binary API compatibility for Scala/Java
> * Audit new public APIs (from the generated html doc)
> ** Scala
> ** Java compatibility
> ** Python coverage
> ** R
> * Check Experimental, DeveloperApi tags
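> A quick sketch of what the Experimental / DeveloperApi tag audit looks for on new public APIs (the class and method names below are made up for illustration; binary compatibility for the Scala/Java APIs is checked separately, normally via the {{dev/mima}} script against the previous release):
> {code:scala}
> import org.apache.spark.annotation.{DeveloperApi, Experimental, Since}
>
> /**
>  * :: Experimental ::
>  * Hypothetical new public estimator added in this release.
>  */
> @Experimental
> @Since("2.1.0")
> class MyNewEstimator {
>
>   // New public methods should carry an @Since matching the release they first appear in.
>   @Since("2.1.0")
>   def explainSettings(): String = ""
> }
>
> /**
>  * :: DeveloperApi ::
>  * Hypothetical low-level hook for advanced users; not part of the stable public API.
>  */
> @DeveloperApi
> class MyNewBackendHook
> {code}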
> h2. Algorithms and performance
> *Performance*
> * _List any other missing performance tests from spark-perf here_
> * perf-tests for transformers (SPARK-2838)
> * MultilayerPerceptron (SPARK-11911)
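> Not the spark-perf harness itself, just a rough timing sketch of the kind of check the MultilayerPerceptron item calls for (dataset size, layer sizes, and iteration count are arbitrary choices for illustration); it should run as-is from spark-shell:
> {code:scala}
> import scala.util.Random
>
> import org.apache.spark.ml.classification.MultilayerPerceptronClassifier
> import org.apache.spark.ml.linalg.Vectors
> import org.apache.spark.sql.SparkSession
>
> val spark = SparkSession.builder().appName("MLPTimingSketch").getOrCreate()
>
> // Synthetic binary-classification data: 10k rows, 100 features.
> val rng = new Random(42)
> val data = (0 until 10000).map { i =>
>   ((i % 2).toDouble, Vectors.dense(Array.fill(100)(rng.nextDouble())))
> }
> val df = spark.createDataFrame(data).toDF("label", "features").cache()
> df.count()  // materialize the cache so the timing below covers only fit()
>
> val mlp = new MultilayerPerceptronClassifier()
>   .setLayers(Array(100, 50, 2))  // input size, one hidden layer, two output classes
>   .setBlockSize(128)
>   .setMaxIter(20)
>   .setSeed(1234L)
>
> val start = System.nanoTime()
> val model = mlp.fit(df)
> println(s"MLP fit took ${(System.nanoTime() - start) / 1e9} s")
> {code}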
> h2. Documentation and example code
> * For new algorithms, create JIRAs for updating the user guide sections & examples (see the example-code sketch after this list)
> * Update Programming Guide
> * Update website
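> For the example-code sketch referenced above: new user guide examples typically live under examples/src/main/scala/org/apache/spark/examples/ml/ and are spliced into the guide by the docs build via $example on$ / $example off$ markers plus an include_example tag. The file name and feature below are placeholders for illustration:
> {code:scala}
> // Hypothetical file: examples/src/main/scala/org/apache/spark/examples/ml/MyNewFeatureExample.scala
> package org.apache.spark.examples.ml
>
> // $example on$
> import org.apache.spark.ml.feature.Binarizer
> // $example off$
> import org.apache.spark.sql.SparkSession
>
> object MyNewFeatureExample {
>   def main(args: Array[String]): Unit = {
>     val spark = SparkSession.builder().appName("MyNewFeatureExample").getOrCreate()
>
>     // $example on$
>     // Only the code between the markers is pulled into the user guide page.
>     val data = spark.createDataFrame(Seq((0, 0.1), (1, 0.8), (2, 0.2))).toDF("id", "feature")
>     val binarizer = new Binarizer()
>       .setInputCol("feature")
>       .setOutputCol("binarized_feature")
>       .setThreshold(0.5)
>     binarizer.transform(data).show()
>     // $example off$
>
>     spark.stop()
>   }
> }
> {code}
> The corresponding user guide page would then reference it with a line like {% include_example scala/org/apache/spark/examples/ml/MyNewFeatureExample.scala %} so the guide and the example code stay in sync.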


