Posted to issues@spark.apache.org by "Joseph K. Bradley (JIRA)" <ji...@apache.org> on 2016/04/21 01:25:25 UTC

[jira] [Commented] (SPARK-12468) getParamMap in Pyspark ML API returns empty dictionary in example for Documentation

    [ https://issues.apache.org/jira/browse/SPARK-12468?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15250925#comment-15250925 ] 

Joseph K. Bradley commented on SPARK-12468:
-------------------------------------------

Thinking about this more, I think this is a duplicate of [SPARK-10931].  I'm going to close this, but thanks for bringing it up.

> getParamMap in Pyspark ML API returns empty dictionary in example for Documentation
> -----------------------------------------------------------------------------------
>
>                 Key: SPARK-12468
>                 URL: https://issues.apache.org/jira/browse/SPARK-12468
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 1.5.2
>            Reporter: Zachary Brown
>            Priority: Minor
>
> The `extractParamMap()` method on a model that has been fit returns an empty dictionary, as in the following example from the [Pyspark ML API Documentation](http://spark.apache.org/docs/latest/ml-guide.html#example-estimator-transformer-and-param):
> ```python
> from pyspark.mllib.linalg import Vectors
> from pyspark.ml.classification import LogisticRegression
> from pyspark.ml.param import Param, Params
> # Prepare training data from a list of (label, features) tuples.
> training = sqlContext.createDataFrame([
>     (1.0, Vectors.dense([0.0, 1.1, 0.1])),
>     (0.0, Vectors.dense([2.0, 1.0, -1.0])),
>     (0.0, Vectors.dense([2.0, 1.3, 1.0])),
>     (1.0, Vectors.dense([0.0, 1.2, -0.5]))], ["label", "features"])
> # Create a LogisticRegression instance. This instance is an Estimator.
> lr = LogisticRegression(maxIter=10, regParam=0.01)
> # Print out the parameters, documentation, and any default values.
> print "LogisticRegression parameters:\n" + lr.explainParams() + "\n"
> # Learn a LogisticRegression model. This uses the parameters stored in lr.
> model1 = lr.fit(training)
> # Since model1 is a Model (i.e., a transformer produced by an Estimator),
> # we can view the parameters it used during fit().
> # This prints the parameter (name: value) pairs, where names are unique IDs for this
> # LogisticRegression instance.
> print "Model 1 was fit using parameters: "
> print model1.extractParamMap()
> ```
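> For what it's worth, the parameter values do still seem to be retrievable from the estimator itself. A minimal workaround sketch (assuming the `lr` estimator from the snippet above is still in scope):
> ```python
> # Observed: model1.extractParamMap() prints an empty dict ({}) instead of the
> # maxIter/regParam values that were set on lr.
> # Workaround sketch: read the parameters from the estimator that produced the
> # model, since those are the values that were used during fit().
> print "Estimator parameters used for the fit: "
> print lr.extractParamMap()
> # Individual getters on the estimator also return the configured values:
> print lr.getMaxIter()   # 10
> print lr.getRegParam()  # 0.01
> ```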


