Posted to issues@spark.apache.org by "Takuya Ueshin (JIRA)" <ji...@apache.org> on 2017/06/03 05:58:04 UTC

[jira] [Resolved] (SPARK-19732) DataFrame.fillna() does not work for bools in PySpark

     [ https://issues.apache.org/jira/browse/SPARK-19732?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Takuya Ueshin resolved SPARK-19732.
-----------------------------------
       Resolution: Fixed
    Fix Version/s: 2.3.0

Issue resolved by pull request 18164
[https://github.com/apache/spark/pull/18164]

> DataFrame.fillna() does not work for bools in PySpark
> -----------------------------------------------------
>
>                 Key: SPARK-19732
>                 URL: https://issues.apache.org/jira/browse/SPARK-19732
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>    Affects Versions: 2.1.0
>            Reporter: Len Frodgers
>            Priority: Minor
>             Fix For: 2.3.0
>
>
> In PySpark, the fillna function of DataFrame inadvertently casts bools to ints, so fillna cannot be used to fill in True/False values.
> e.g. `spark.createDataFrame([Row(a=True), Row(a=None)]).fillna(True).collect()`
> yields
> `[Row(a=True), Row(a=None)]`
> The second Row should be a=True.
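> For reference, a self-contained reproduction might look like this (the SparkSession setup is assumed here and was not part of the original report):
> {code}
> from pyspark.sql import Row, SparkSession
>
> spark = SparkSession.builder.master("local[1]").getOrCreate()
>
> df = spark.createDataFrame([Row(a=True), Row(a=None)])
> # Expected: [Row(a=True), Row(a=True)]
> # Actual on 2.1.0: [Row(a=True), Row(a=None)] -- the None is not filled
> print(df.fillna(True).collect())
> {code}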
> The cause is this bit of code: 
> {code}
> if isinstance(value, (int, long)):
>     value = float(value)
> {code}
> There needs to be a separate isinstance(value, bool) check before the int check, since in Python, bool is a subclass of int.
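> A minimal sketch of the kind of guard that would avoid the coercion (hypothetical; not necessarily the code from the eventual patch):
> {code}
> # bool must be tested first, because isinstance(True, int) is also True
> if isinstance(value, bool):
>     pass  # leave booleans untouched so they can fill boolean columns
> elif isinstance(value, (int, long)):
>     value = float(value)
> {code}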
> Additionally, there is another anomaly:
> Spark (and PySpark) supports filling bools if you specify the args as a map:
> {code}
> fillna({"a": False})
> {code}
> but not if you specify it as a bare scalar:
> {code}
> fillna(False)
> {code}
> This is because (scala-)Spark has no
> {code}
> def fill(value: Boolean): DataFrame = fill(value, df.columns)
> {code}
>  method. I find that strange and buggy.
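> Until such an overload exists, the map form serves as a workaround (hypothetical usage; the column name "a" is just an example):
> {code}
> # fills only column "a"; list every boolean column to emulate fillna(False)
> spark.createDataFrame([Row(a=True), Row(a=None)]).fillna({"a": False}).collect()
> # -> [Row(a=True), Row(a=False)]
> {code}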


