Posted to issues@spark.apache.org by "Kazuaki Ishizaki (JIRA)" <ji...@apache.org> on 2018/02/25 16:18:00 UTC

[jira] [Commented] (SPARK-23498) Accuracy problem in comparison with string and integer

    [ https://issues.apache.org/jira/browse/SPARK-23498?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16376137#comment-16376137 ] 

Kazuaki Ishizaki commented on SPARK-23498:
------------------------------------------

Spark implicitly casts a {{String}} to the type of the other operand of a binary operator. I think this rule is the same as the [rule|https://www.postgresql.org/docs/10/static/typeconv-oper.html] in PostgreSQL.

The following query would work as you expected, since the literal {{1000.0}} makes the comparison happen in a fractional type.
{code}
select '1000.1'>1000.0
{code}
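To see why the original query misbehaves, here is a minimal Python sketch of the cast semantics (an illustration of the truncation, not Spark's actual implementation): casting the string side to {{int}} drops the fractional part, while casting both sides to {{double}} preserves it.

```python
s = "1000.1"

# Spark's behavior per the physical plan: CAST('1000.1' AS INT) yields 1000,
# so the fractional part is discarded before the comparison.
as_int = int(float(s))          # 1000
print(as_int > 1000)            # False -- the .1 was lost in the cast

# Casting both operands to a wider type such as double keeps the fraction.
as_double = float(s)            # 1000.1
print(as_double > float(1000))  # True -- the expected result
```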

cc: [~LI,Xiao]

> Accuracy problem in comparison with string and integer
> ------------------------------------------------------
>
>                 Key: SPARK-23498
>                 URL: https://issues.apache.org/jira/browse/SPARK-23498
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.2.0, 2.2.1, 2.3.0
>            Reporter: Kevin Zhang
>            Priority: Major
>
> While comparing a string column with an integer value, Spark SQL automatically casts the string operand to int. The following SQL returns true in Hive but false in Spark:
>  
> {code:java}
> select '1000.1'>1000
> {code}
>  
> From the physical plan we can see that the string operand was cast to int, which caused the accuracy loss:
> {code:java}
> *Project [false AS (CAST(1000.1 AS INT) > 1000)#4]
> +- Scan OneRowRelation[]
> {code}
> To solve it, casting both operands of the binary operator to a wider common type, such as double, may be safe.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org