Posted to issues@spark.apache.org by "Dongjoon Hyun (Jira)" <ji...@apache.org> on 2020/09/08 03:46:00 UTC

[jira] [Resolved] (SPARK-32764) comparison of -0.0 < 0.0 returns true

     [ https://issues.apache.org/jira/browse/SPARK-32764?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-32764.
-----------------------------------
    Fix Version/s: 3.1.0
                   3.0.1
       Resolution: Fixed

Issue resolved by pull request 29647
[https://github.com/apache/spark/pull/29647]

> comparison of -0.0 < 0.0 returns true
> -------------------------------------
>
>                 Key: SPARK-32764
>                 URL: https://issues.apache.org/jira/browse/SPARK-32764
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 3.0.0, 3.0.1
>            Reporter: Izek Greenfield
>            Assignee: Wenchen Fan
>            Priority: Major
>              Labels: correctness
>             Fix For: 3.0.1, 3.1.0
>
>         Attachments: 2.4_codegen.txt, 3.0_Codegen.txt
>
>
> {code:scala}
> import org.apache.spark.sql.SparkSession
> import org.apache.spark.sql.functions.col
>
> val spark: SparkSession = SparkSession
>   .builder()
>   .master("local")
>   .appName("SparkByExamples.com")
>   .getOrCreate()
> spark.sparkContext.setLogLevel("ERROR")
> import spark.implicits._
>
> val df = Seq((-0.0, 0.0)).toDF("neg", "pos")
>   .withColumn("comp", col("neg") < col("pos"))
> df.show(false)
>
> // Output on Spark 3.0.0:
> +----+---+----+
> |neg |pos|comp|
> +----+---+----+
> |-0.0|0.0|true|
> +----+---+----+{code}
> I think the result should be false.
> **Apache Spark 2.4.6 RESULT**
> {code}
> scala> spark.version
> res0: String = 2.4.6
> scala> Seq((-0.0, 0.0)).toDF("neg", "pos").withColumn("comp", col("neg") < col("pos")).show
> +----+---+-----+
> | neg|pos| comp|
> +----+---+-----+
> |-0.0|0.0|false|
> +----+---+-----+
> {code}
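The two results trace to two different comparison semantics available on the JVM: IEEE 754 primitive comparison, under which -0.0 and 0.0 are equal, and java.lang.Double.compare's total order, under which -0.0 sorts strictly before 0.0. Whether Spark 3.0.0's codegen actually routed the predicate through the latter is an assumption here, not something this thread confirms; the sketch below only demonstrates the two JVM behaviors, with no Spark dependency:

```scala
object ZeroComparison extends App {
  // IEEE 754 semantics, which primitive "<" follows:
  // -0.0 and 0.0 compare equal, so "<" yields false.
  val primitiveLess: Boolean = -0.0 < 0.0
  println(s"-0.0 < 0.0 (primitive): $primitiveLess") // false

  // java.lang.Double.compare imposes a total order via the raw bit
  // patterns, in which -0.0 sorts strictly before 0.0.
  val totalOrder: Int = java.lang.Double.compare(-0.0, 0.0)
  println(s"Double.compare(-0.0, 0.0): $totalOrder") // -1
}
```

Any code path that normalizes a SQL `<` through the total order would therefore report -0.0 < 0.0 as true, matching the regression above.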



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org