Posted to issues@spark.apache.org by "Yan Facai (颜发才) (JIRA)" <ji...@apache.org> on 2017/03/24 01:58:42 UTC

[jira] [Issue Comment Deleted] (SPARK-20043) CrossValidatorModel loader does not recognize impurity "Gini" and "Entropy" on ML random forest and decision tree. Only "gini" and "entropy" (in lower case) are accepted

     [ https://issues.apache.org/jira/browse/SPARK-20043?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Yan Facai (颜发才) updated SPARK-20043:
------------------------------------
    Comment: was deleted

(was: [~zsellami] could you give an example of your code?

I tried to reproduce the bug:
```scala
import org.apache.spark.ml.regression.DecisionTreeRegressor
import org.apache.spark.ml.tuning.ParamGridBuilder

val dt = new DecisionTreeRegressor()

val paramMaps = new ParamGridBuilder()
    .addGrid(dt.impurity, Array("Gini", "Entropy"))
    .build()
```
however, an IllegalArgumentException is thrown because "Gini" is not a valid parameter value.)
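
A likely reason for that exception, separate from the reported casing bug: in Spark ML, DecisionTreeRegressor supports only the "variance" impurity, so "Gini"/"Entropy" would be rejected regardless of case; "gini" and "entropy" belong to the tree classifiers. A minimal sketch of the same grid against DecisionTreeClassifier (illustrative, not from the original thread):
```scala
import org.apache.spark.ml.classification.DecisionTreeClassifier
import org.apache.spark.ml.tuning.ParamGridBuilder

// Tree classifiers accept "gini" and "entropy"; per the report, the
// capitalized spellings pass validation when fitting but later break
// model loading.
val dt = new DecisionTreeClassifier()
val paramMaps = new ParamGridBuilder()
  .addGrid(dt.impurity, Array("Gini", "Entropy"))
  .build()
```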

> CrossValidatorModel loader does not recognize impurity "Gini" and "Entropy" on ML random forest and decision tree. Only "gini" and "entropy" (in lower case) are accepted
> --------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-20043
>                 URL: https://issues.apache.org/jira/browse/SPARK-20043
>             Project: Spark
>          Issue Type: Bug
>          Components: ML
>    Affects Versions: 2.1.0
>            Reporter: Zied Sellami
>              Labels: starter
>
> I saved a CrossValidatorModel containing a decision tree and a random forest, using ParamGridBuilder to test the "gini" and "entropy" impurities. CrossValidatorModel is not able to load the saved model when the impurity is written in anything other than lowercase: Spark raises the error "impurity Gini (Entropy) not recognized".
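> A minimal sketch of this failure mode (the estimator, evaluator, dataset, and save path below are illustrative assumptions, not taken from the report):
> ```scala
> import org.apache.spark.ml.classification.RandomForestClassifier
> import org.apache.spark.ml.evaluation.MulticlassClassificationEvaluator
> import org.apache.spark.ml.tuning.{CrossValidator, CrossValidatorModel, ParamGridBuilder}
>
> val rf = new RandomForestClassifier()
>
> // Capitalized impurity values are accepted while building the grid and
> // fitting, and are persisted as written.
> val grid = new ParamGridBuilder()
>   .addGrid(rf.impurity, Array("Gini", "Entropy"))
>   .build()
>
> val cv = new CrossValidator()
>   .setEstimator(rf)
>   .setEvaluator(new MulticlassClassificationEvaluator())
>   .setEstimatorParamMaps(grid)
>   .setNumFolds(3)
>
> // `training` stands in for any DataFrame with label/features columns.
> val model = cv.fit(training)
> model.write.overwrite().save("/tmp/cv-model")
>
> // Loading then fails with "impurity Gini (Entropy) not recognized",
> // since the loader matches only the lowercase names.
> val loaded = CrossValidatorModel.load("/tmp/cv-model")
> ```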



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org