Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2019/10/15 17:54:29 UTC

[GitHub] [spark] BryanCutler commented on issue #26130: [SPARK-29464][PYTHON][ML] PySpark ML should expose Params.clear() to unset a user supplied Param

URL: https://github.com/apache/spark/pull/26130#issuecomment-542331326
 
 
   @huaxingao I think there is a problem when the Java object has the param set: clearing it in Python does not clear it on the Java side, so the previously set value is still used. For example:
   
   ```python
    In [4]: from pyspark.ml.linalg import Vectors
       ...: from pyspark.ml.feature import MaxAbsScaler
       ...: df = spark.createDataFrame([(Vectors.dense([1.0]),), (Vectors.dense([2.0]),)], ["a"])
       ...: maScaler = MaxAbsScaler(inputCol="a", outputCol="scaled")
       ...: model = maScaler.fit(df)
       ...: model.setOutputCol("scaledOutput")
    Out[4]: MaxAbsScaler_f91118f1dd81

    In [7]: model.clear(model.outputCol)

    In [9]: model.getOutputCol()
    Out[9]: 'MaxAbsScaler_f91118f1dd81__output'

    In [11]: model.transform(df).show()
    +-----+------+
    |    a|scaled|
    +-----+------+
    |[1.0]| [0.5]|
    |[2.0]| [1.0]|
    +-----+------+
   ```
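
   One possible direction (just a sketch on my side, not tested, and not necessarily how this PR should do it): have the Python-side `clear()` also clear the param on the wrapped Java object. The Scala `Params` trait exposes `getParam(name)` and `clear(param)`, so something along these lines could work for `JavaParams`; the subclass name below is made up for illustration:

    ```python
    from pyspark.ml.wrapper import JavaParams

    # Sketch only: keep the wrapped Java object in sync when a param is
    # cleared on the Python side.
    class JavaSyncedParams(JavaParams):

        def clear(self, param):
            """Clears a param from the param map if it has been explicitly set."""
            # Remove the value from the Python param map.
            super(JavaSyncedParams, self).clear(param)
            # Also clear it on the Java object; the Scala Params trait
            # exposes getParam(name) and clear(param).
            if self._java_obj is not None:
                java_param = self._java_obj.getParam(param.name)
                self._java_obj.clear(java_param)
    ```

   With something like that in place, `model.clear(model.outputCol)` would presumably make `transform` fall back to the default output column instead of the previously set one.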
