Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2016/11/14 11:39:58 UTC

[jira] [Commented] (SPARK-18434) Add missing ParamValidations for ML algos

    [ https://issues.apache.org/jira/browse/SPARK-18434?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15663701#comment-15663701 ] 

Apache Spark commented on SPARK-18434:
--------------------------------------

User 'zhengruifeng' has created a pull request for this issue:
https://github.com/apache/spark/pull/15881

> Add missing ParamValidations for ML algos
> -----------------------------------------
>
>                 Key: SPARK-18434
>                 URL: https://issues.apache.org/jira/browse/SPARK-18434
>             Project: Spark
>          Issue Type: Improvement
>          Components: ML
>            Reporter: zhengruifeng
>            Priority: Minor
>
> Add ParamValidations to make the following lines fail (a sketch of how such validators are typically declared follows the example):
> {code}
> scala> val idf = new IDF().setMinDocFreq(-100)
> idf: org.apache.spark.ml.feature.IDF = idf_4d2e6a4f2361
> scala> val pca = new PCA().setK(-100)
> pca: org.apache.spark.ml.feature.PCA = pca_7b22fbec5e97
> scala> val w2v = new Word2Vec().setVectorSize(-100)
> w2v: org.apache.spark.ml.feature.Word2Vec = w2v_06be869a20d9
> scala> val iso = new IsotonicRegression().setFeatureIndex(-100)
> iso: org.apache.spark.ml.regression.IsotonicRegression = isoReg_b9c59e9b6cbd
> scala> val lir = new LinearRegression().setSolver("1234")
> lir: org.apache.spark.ml.regression.LinearRegression = linReg_4e3c1c5e2904
> scala> val rfc = new RandomForestClassifier().setMinInfoGain(-100)
> rfc: org.apache.spark.ml.classification.RandomForestClassifier = rfc_6db27a737216
> {code}
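> A minimal sketch of the intended fix, assuming validators from org.apache.spark.ml.param.ParamValidators are attached where the params are declared; the trait and param choices below are illustrative, not the actual shared-param traits:
> {code}
> import org.apache.spark.ml.param.{IntParam, Param, ParamValidators, Params}
>
> // Illustrative trait; the real params live in each algorithm's param traits.
> trait ValidatedParamsExample extends Params {
>
>   // Param.validate runs the isValid function at set time, so a negative
>   // value is rejected as soon as setMinDocFreq(-100) is called.
>   final val minDocFreq: IntParam = new IntParam(this, "minDocFreq",
>     "minimum number of documents in which a term should appear for filtering (>= 0)",
>     ParamValidators.gtEq(0))
>
>   // String-valued params such as solver can use an allow-list validator.
>   final val solver: Param[String] = new Param[String](this, "solver",
>     "the solver algorithm for optimization (auto, normal, l-bfgs)",
>     ParamValidators.inArray(Array("auto", "normal", "l-bfgs")))
>
>   def getMinDocFreq: Int = $(minDocFreq)
>   def getSolver: String = $(solver)
> }
> {code}
> With such validators in place, a call like new IDF().setMinDocFreq(-100) should throw an IllegalArgumentException at set time instead of silently accepting the value.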



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org