Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2021/05/03 10:35:00 UTC

[jira] [Commented] (SPARK-35265) abs return negative

    [ https://issues.apache.org/jira/browse/SPARK-35265?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17338298#comment-17338298 ] 

Hyukjin Kwon commented on SPARK-35265:
--------------------------------------

This is fixed via the {{spark.sql.ansi.enabled}} configuration, which makes {{abs}} throw on integer overflow instead of silently returning a negative value:

{code}
: java.lang.ArithmeticException: integer overflow
	at java.lang.Math.negateExact(Math.java:977)
	at org.apache.spark.sql.types.IntegerExactNumeric$.negate(numerics.scala:102)
	at org.apache.spark.sql.types.IntegerExactNumeric$.negate(numerics.scala:95)
	at scala.math.Numeric.abs(Numeric.scala:212)
	at scala.math.Numeric.abs$(Numeric.scala:212)
{code}

It's disabled by default to keep the legacy behaviour and avoid breaking users' apps.
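For context, the legacy (non-ANSI) behaviour follows Java's 32-bit two's-complement arithmetic: {{Integer.MIN_VALUE}} (-2147483648) has no positive counterpart in 32 bits, so negating it wraps back to itself. A minimal plain-Python sketch of that wrapping (the {{abs32}} helper is illustrative only, not a Spark API):

```python
INT_MIN = -2**31  # -2147483648, the smallest 32-bit signed integer

def abs32(x: int) -> int:
    """Emulate abs() on a 32-bit signed int, wrapping on overflow
    the way legacy (non-ANSI) integer arithmetic does."""
    r = -x if x < 0 else x
    # Wrap the result back into the signed 32-bit range,
    # mimicking two's-complement overflow.
    return ((r - INT_MIN) % 2**32) + INT_MIN

print(abs32(-2147483648))  # -2147483648: negating INT_MIN overflows back to itself
print(abs32(-5))           # 5: ordinary values are unaffected
```

With {{spark.sql.ansi.enabled=true}}, Spark instead calls {{Math.negateExact}}, which detects this exact wrap-around and throws the {{ArithmeticException}} shown above.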

> abs return negative
> -------------------
>
>                 Key: SPARK-35265
>                 URL: https://issues.apache.org/jira/browse/SPARK-35265
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 3.0.1, 3.1.1
>            Reporter: liuzhenjie
>            Priority: Major
>
> from pyspark.sql.functions import lit, abs, col
> df = spark.range(5)  # any DataFrame works; source of df omitted in the report
> df = df.withColumn('partition_id', lit(-2147483648))
> df = df.withColumn('abs_id', abs(col('partition_id')))
> df.select('abs_id', 'partition_id').show()
>  
> When the input is -2147483648, abs returns a negative result:
> +-----------+------------+
> |     abs_id|partition_id|
> +-----------+------------+
> |-2147483648| -2147483648|
> |-2147483648| -2147483648|
> |-2147483648| -2147483648|
> |-2147483648| -2147483648|
> |-2147483648| -2147483648|
> +-----------+------------+
>  



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
