Posted to issues@spark.apache.org by "Timothy Hunter (JIRA)" <ji...@apache.org> on 2016/07/11 20:14:10 UTC

[jira] [Updated] (SPARK-16485) Additional fixes to MLlib 2.0 documentation

     [ https://issues.apache.org/jira/browse/SPARK-16485?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Timothy Hunter updated SPARK-16485:
-----------------------------------
    Description: 
While reviewing the documentation of MLlib, I found some additional issues.

Important issues that affect the binary signatures:
- GBTClassificationModel: all the setters should be overridden
- LogisticRegressionModel: setThreshold(s)
- RandomForestClassificationModel: all the setters should be overridden
- org.apache.spark.ml.stat.distribution.MultivariateGaussian is exposed, but most of its methods are private[ml] -> do we need to expose this class for now?
- GeneralizedLinearRegressionModel: linkObj, familyObj, familyAndLink should not be exposed
- sqlDataTypes: name does not follow conventions. Do we need to expose it?
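For context, the setter issue can be sketched with toy classes (names here are illustrative, not the actual GBTClassificationModel API): a concrete model re-declares the inherited setter so the method is part of the model's own public signature.

```scala
// Toy sketch of the Spark ML setter pattern (illustrative names only).
// Setters return this.type so calls can be chained; a concrete model
// overrides the inherited setter so the method shows up in the model's
// own public (binary) signature.
abstract class ParamsLike {
  protected var maxIter: Int = 10
  def setMaxIter(value: Int): this.type = { maxIter = value; this }
  def getMaxIter: Int = maxIter
}

class GBTModelLike extends ParamsLike {
  // Without this override, the setter is declared only on ParamsLike,
  // which is the kind of binary-signature gap flagged above.
  override def setMaxIter(value: Int): this.type = super.setMaxIter(value)
}

// Chained calls keep the concrete type:
val model = new GBTModelLike().setMaxIter(5)
```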

Issues that involve only documentation:
- Evaluator:
  1. inconsistent doc between evaluate and isLargerBetter
- MinMaxScaler: math rendering
- GeneralizedLinearRegressionSummary: aic doc is incorrect
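To make the evaluate/isLargerBetter relationship concrete, here is a minimal sketch of the intended contract (toy types, not the actual org.apache.spark.ml.evaluation.Evaluator): evaluate returns a metric, and isLargerBetter tells callers how two metric values compare.

```scala
// Toy sketch of the contract whose docs are inconsistent: callers
// (e.g. model selection) must consult isLargerBetter before comparing
// the values returned by evaluate.
trait EvaluatorLike {
  def evaluate(predictionsAndLabels: Seq[(Double, Double)]): Double
  // Defaults to true (e.g. accuracy, AUC); metrics like RMSE override it.
  def isLargerBetter: Boolean = true
}

class RmseEvaluatorLike extends EvaluatorLike {
  def evaluate(predictionsAndLabels: Seq[(Double, Double)]): Double = {
    val mse = predictionsAndLabels
      .map { case (p, l) => (p - l) * (p - l) }
      .sum / predictionsAndLabels.size
    math.sqrt(mse)
  }
  // For RMSE, smaller is better, so larger-is-better must not be assumed.
  override def isLargerBetter: Boolean = false
}

val rmse = new RmseEvaluatorLike
val metric = rmse.evaluate(Seq((1.0, 1.0), (3.0, 1.0)))
```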


The reference documentation used was:
http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc2-docs/

  was:
While reviewing the documentation of MLlib, I found some additional issues.

Important issues that affect the binary signatures:
- GBTClassificationModel: all the setters should be overridden
- LogisticRegressionModel: setThreshold(s)
- RandomForestClassificationModel: all the setters should be overridden
- org.apache.spark.ml.stat.distribution.MultivariateGaussian is exposed, but most of its methods are private[ml] -> do we need to expose this class for now?
- GeneralizedLinearRegressionModel: linkObj, familyObj, familyAndLink should not be exposed
- sqlDataTypes: name does not follow conventions. Do we need to expose it?

Issues that involve only documentation:
- Evaluator:
  1. inconsistent doc between evaluate and isLargerBetter
  2. missing `def evaluate(dataset: Dataset[_]): Double` from the doc (the other method with the same name shows up). This may be a bug in scaladoc.
- MinMaxScaler: math rendering
- GeneralizedLinearRegressionSummary: aic doc is incorrect


The reference documentation used was:
http://people.apache.org/~pwendell/spark-releases/spark-2.0.0-rc2-docs/


> Additional fixes to MLlib 2.0 documentation
> -------------------------------------------
>
>                 Key: SPARK-16485
>                 URL: https://issues.apache.org/jira/browse/SPARK-16485
>             Project: Spark
>          Issue Type: Sub-task
>          Components: Documentation, GraphX, ML, MLlib, SparkR
>            Reporter: Timothy Hunter
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org