Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2017/05/09 06:22:04 UTC

[jira] [Assigned] (SPARK-20673) LDA `optimizer` does not really support case insensitivity

     [ https://issues.apache.org/jira/browse/SPARK-20673?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-20673:
------------------------------------

    Assignee: Apache Spark

> LDA `optimizer` does not really support case insensitivity
> ----------------------------------------------------------
>
>                 Key: SPARK-20673
>                 URL: https://issues.apache.org/jira/browse/SPARK-20673
>             Project: Spark
>          Issue Type: Bug
>          Components: ML
>    Affects Versions: 2.2.0
>            Reporter: zhengruifeng
>            Assignee: Apache Spark
>            Priority: Minor
>
> The declaration of {{optimizer}} in {{LDA}} suggests that it should be case insensitive:
> {code}  
> final val optimizer = new Param[String](this, "optimizer", "Optimizer or inference" +
>     " algorithm used to estimate the LDA model. Supported: " + supportedOptimizers.mkString(", "),
>     (o: String) =>
>       ParamValidators.inArray(supportedOptimizers).apply(o.toLowerCase(Locale.ROOT)))
> {code}
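> (For illustration, a self-contained REPL sketch of the check performed by this validator, not the Spark source: the input is lowercased only for the membership test, while the stored param value keeps its original case.)
> {code}
> import java.util.Locale
>
> // Standalone re-creation of the validator's check:
> val supportedOptimizers = Array("online", "em")
> val validate = (o: String) =>
>   supportedOptimizers.contains(o.toLowerCase(Locale.ROOT))
>
> validate("Em")  // true, so setOptimizer("Em") is accepted,
>                 // yet the param value remains "Em", not "em"
> {code}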
> However, it does not actually support uppercase; the setter accepts the value, but {{fit}} then fails with a {{MatchError}}:
> {code}
> scala> val lda = new LDA().setOptimizer("Em")
> lda: org.apache.spark.ml.clustering.LDA = lda_dde23ccc5792
> scala> lda.getOptimizer
> res18: String = Em
> scala> lda.fit(dataset)
> scala.MatchError: Em (of class java.lang.String)
>   at org.apache.spark.ml.clustering.LDAParams$class.getOldOptimizer(LDA.scala:351)
>   at org.apache.spark.ml.clustering.LDA.getOldOptimizer(LDA.scala:809)
>   at org.apache.spark.ml.clustering.LDA.fit(LDA.scala:898)
>   ... 54 elided
> {code}
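> One possible remedy (a sketch, not necessarily the committed patch) is to normalize the value when it is set, so the stored param always carries the lowercase spelling that {{getOldOptimizer}} evidently pattern-matches on (the {{MatchError}} on {{Em}} comes from that match):
> {code}
> import java.util.Locale
>
> // Sketch: lowercase the user-supplied value before storing it, so that
> // downstream exact-string matches on "online" / "em" never see a
> // mixed-case value such as "Em".
> def setOptimizer(value: String): this.type =
>   set(optimizer, value.toLowerCase(Locale.ROOT))
> {code}
> Alternatively, {{getOldOptimizer}} itself could lowercase {{getOptimizer}} before matching; either way, the validation and the consumption of the param would agree on case.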



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
