Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2015/05/08 07:57:00 UTC
[jira] [Assigned] (SPARK-7474) ParamGridBuilder's doctest doesn't show up correctly in the generated doc
[ https://issues.apache.org/jira/browse/SPARK-7474?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-7474:
-----------------------------------
Assignee: Apache Spark (was: Xiangrui Meng)
> ParamGridBuilder's doctest doesn't show up correctly in the generated doc
> -------------------------------------------------------------------------
>
> Key: SPARK-7474
> URL: https://issues.apache.org/jira/browse/SPARK-7474
> Project: Spark
> Issue Type: Documentation
> Components: Documentation, ML
> Affects Versions: 1.4.0
> Reporter: Xiangrui Meng
> Assignee: Apache Spark
>
> {code}
> >>> from classification import LogisticRegression
> >>> lr = LogisticRegression()
> >>> output = ParamGridBuilder() \
> ...     .baseOn({lr.labelCol: 'l'}) \
> ...     .baseOn([lr.predictionCol, 'p']) \
> ...     .addGrid(lr.regParam, [1.0, 2.0, 3.0]) \
> ...     .addGrid(lr.maxIter, [1, 5]) \
> ...     .addGrid(lr.featuresCol, ['f']) \
> ...     .build()
> >>> expected = [
> ...     {lr.regParam: 1.0, lr.featuresCol: 'f', lr.maxIter: 1, lr.labelCol: 'l', lr.predictionCol: 'p'},
> ...     {lr.regParam: 2.0, lr.featuresCol: 'f', lr.maxIter: 1, lr.labelCol: 'l', lr.predictionCol: 'p'},
> ...     {lr.regParam: 3.0, lr.featuresCol: 'f', lr.maxIter: 1, lr.labelCol: 'l', lr.predictionCol: 'p'},
> ...     {lr.regParam: 1.0, lr.featuresCol: 'f', lr.maxIter: 5, lr.labelCol: 'l', lr.predictionCol: 'p'},
> ...     {lr.regParam: 2.0, lr.featuresCol: 'f', lr.maxIter: 5, lr.labelCol: 'l', lr.predictionCol: 'p'},
> ...     {lr.regParam: 3.0, lr.featuresCol: 'f', lr.maxIter: 5, lr.labelCol: 'l', lr.predictionCol: 'p'}]
> >>> len(output) == len(expected)
> True
> >>> all([m in expected for m in output])
> True
> {code}
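For context, the semantics the doctest above exercises are a cross product of the grids, with `baseOn` pinning fixed values into every param map. A minimal self-contained sketch of that behavior (plain strings stand in for PySpark `Param` objects, and `ParamGridBuilderSketch` is a hypothetical stand-in, not PySpark's actual implementation):

```python
from itertools import product

class ParamGridBuilderSketch:
    """Minimal sketch of ParamGridBuilder's grid semantics."""

    def __init__(self):
        self._grid = {}  # param -> list of candidate values

    def baseOn(self, *args):
        # Pin fixed values into every param map; accepts a dict or a
        # single [param, value] pair, mirroring the doctest's two call styles.
        if isinstance(args[0], dict):
            for param, value in args[0].items():
                self.addGrid(param, [value])
        else:
            for param, value in args:
                self.addGrid(param, [value])
        return self

    def addGrid(self, param, values):
        self._grid[param] = values
        return self

    def build(self):
        # Cartesian product over all grids -> one dict per combination.
        keys = list(self._grid)
        return [dict(zip(keys, combo)) for combo in product(*self._grid.values())]
```

With the doctest's grids (3 values of `regParam` x 2 of `maxIter`, everything else fixed), `build()` yields the six param maps the expected list enumerates.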
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org