Posted to issues@spark.apache.org by "Reynold Xin (JIRA)" <ji...@apache.org> on 2018/02/02 19:36:01 UTC

[jira] [Commented] (SPARK-21658) Adds the default None for value in na.replace in PySpark to match

    [ https://issues.apache.org/jira/browse/SPARK-21658?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16350848#comment-16350848 ] 

Reynold Xin commented on SPARK-21658:
-------------------------------------

Sorry, but I object to this change. Why would we put null as the default replacement value in a function called replace? That seems very counterintuitive and error-prone.

I also left a comment in https://github.com/apache/spark/pull/16793
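
For illustration, a rough sketch of the failure mode I'm worried about (hypothetical call; assumes {{value}} would default to {{None}}):

{code}
df = spark.createDataFrame([('Alice', 10, 80.0)])

# Intended call: replace 'Alice' with an explicit value, e.g.
#   df.na.replace('Alice', 'a')
# With value defaulting to None, forgetting the second argument would be
# read as "replace 'Alice' with null" instead of failing fast:
df.na.replace('Alice')
{code}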

> Adds the default None for value in na.replace in PySpark to match
> -----------------------------------------------------------------
>
>                 Key: SPARK-21658
>                 URL: https://issues.apache.org/jira/browse/SPARK-21658
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.2.0
>            Reporter: Hyukjin Kwon
>            Assignee: Chin Han Yu
>            Priority: Minor
>              Labels: Starter
>             Fix For: 2.3.0
>
>
> Looks like {{na.replace}} missed the default value {{None}}.
> Both docs say they are aliases:
> http://spark.apache.org/docs/2.2.0/api/python/pyspark.sql.html#pyspark.sql.DataFrame.replace
> http://spark.apache.org/docs/2.2.0/api/python/pyspark.sql.html#pyspark.sql.DataFrameNaFunctions.replace
> but the default values look different, which ends up with:
> {code}
> >>> df = spark.createDataFrame([('Alice', 10, 80.0)])
> >>> df.replace({"Alice": "a"}).first()
> Row(_1=u'a', _2=10, _3=80.0)
> >>> df.na.replace({"Alice": "a"}).first()
> Traceback (most recent call last):
>   File "<stdin>", line 1, in <module>
> TypeError: replace() takes at least 3 arguments (2 given)
> {code}
> To take advantage of SPARK-19454, it sounds like we should match them.
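> A minimal sketch of the behaviour once the defaults match (targeted for 2.3.0); assumes only the missing {{value}} default is added to {{na.replace}}:
> {code}
> >>> df = spark.createDataFrame([('Alice', 10, 80.0)])
> >>> # With the same default in both APIs, the dict form also works via na.replace:
> >>> df.na.replace({"Alice": "a"}).first()
> Row(_1=u'a', _2=10, _3=80.0)
> {code}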


