Posted to issues@spark.apache.org by "Yanbo Liang (JIRA)" <ji...@apache.org> on 2015/08/16 05:13:45 UTC

[jira] [Comment Edited] (SPARK-10009) PySpark Param of Vector type can be set with Python array or numpy.array

    [ https://issues.apache.org/jira/browse/SPARK-10009?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14698523#comment-14698523 ] 

Yanbo Liang edited comment on SPARK-10009 at 8/16/15 3:13 AM:
--------------------------------------------------------------

[~kaisasak] If I understand you correctly, you mean Params that must be set as a keyword dictionary. What I mean is a Param of Vector type, such as [scalingVec|https://github.com/apache/spark/blob/master/mllib/src/main/scala/org/apache/spark/ml/feature/ElementwiseProduct.scala#L47]
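
A quick sketch of what I mean, assuming the pyspark.ml.feature.ElementwiseProduct wrapper (only the Vector form works today; the list/numpy forms are the proposed behavior):

{code}
from pyspark.ml.feature import ElementwiseProduct
from pyspark.mllib.linalg import Vectors
import numpy as np

# Works today: the Vector-typed Param scalingVec set with an MLlib Vector
ep = ElementwiseProduct(scalingVec=Vectors.dense([2.0, 1.0, 3.0]),
                        inputCol="features", outputCol="scaled")

# Proposed: the same Param set with a Python list or numpy.array,
# to be converted to a Vector by the Python/JVM wrapper
ep = ElementwiseProduct(scalingVec=np.array([2.0, 1.0, 3.0]),
                        inputCol="features", outputCol="scaled")
{code}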


> PySpark Param of Vector type can be set with Python array or numpy.array
> ------------------------------------------------------------------------
>
>                 Key: SPARK-10009
>                 URL: https://issues.apache.org/jira/browse/SPARK-10009
>             Project: Spark
>          Issue Type: Improvement
>          Components: ML, PySpark
>            Reporter: Yanbo Liang
>
> If the type of a Param in a PySpark ML pipeline is Vector, it can currently only be set with a Vector. We should also support setting it with a Python array or numpy.array. This should be handled in the wrapper (_transfer_params_to_java).
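>
> A minimal sketch of the coercion the wrapper could apply (the helper name below is hypothetical; the real change would live in _transfer_params_to_java):
>
> {code}
> import numpy as np
> from pyspark.mllib.linalg import Vector, Vectors
>
> def _coerce_to_vector(value):
>     # Hypothetical helper: pass Vectors through untouched, convert
>     # Python lists/tuples and numpy arrays to a DenseVector, and
>     # leave every other Param type unchanged.
>     if isinstance(value, Vector):
>         return value
>     if isinstance(value, (list, tuple, np.ndarray)):
>         return Vectors.dense([float(v) for v in value])
>     return value
> {code}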



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org