Posted to issues@spark.apache.org by "Wenchen Fan (Jira)" <ji...@apache.org> on 2020/01/19 09:24:00 UTC
[jira] [Assigned] (SPARK-30530) CSV load followed by "is null" filter produces incorrect results
[ https://issues.apache.org/jira/browse/SPARK-30530?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Wenchen Fan reassigned SPARK-30530:
-----------------------------------
Assignee: Maxim Gekk
> CSV load followed by "is null" filter produces incorrect results
> ----------------------------------------------------------------
>
> Key: SPARK-30530
> URL: https://issues.apache.org/jira/browse/SPARK-30530
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 3.0.0
> Reporter: Jason Darrell Lowe
> Assignee: Maxim Gekk
> Priority: Major
>
> Filtering on "is null" over values loaded from a CSV file has regressed recently and now produces incorrect results.
> Given a CSV file with the contents:
> {noformat:title=floats.csv}
> 100.0,1.0,
> 200.0,,
> 300.0,3.0,
> 1.0,4.0,
> ,4.0,
> 500.0,,
> ,6.0,
> -500.0,50.5
> {noformat}
> Filtering this data for the first column being null should return exactly two rows, but it returns seven, including extraneous all-null rows:
> {noformat}
> scala> import org.apache.spark.sql.types._
> import org.apache.spark.sql.types._
> scala> val schema = StructType(Array(StructField("floats", FloatType, true),StructField("more_floats", FloatType, true)))
> schema: org.apache.spark.sql.types.StructType = StructType(StructField(floats,FloatType,true), StructField(more_floats,FloatType,true))
> scala> val df = spark.read.schema(schema).csv("floats.csv")
> df: org.apache.spark.sql.DataFrame = [floats: float, more_floats: float]
> scala> df.filter("floats is null").show
> +------+-----------+
> |floats|more_floats|
> +------+-----------+
> | null| null|
> | null| null|
> | null| null|
> | null| null|
> | null| 4.0|
> | null| null|
> | null| 6.0|
> +------+-----------+
> {noformat}
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org