Posted to issues@spark.apache.org by "Gengliang Wang (Jira)" <ji...@apache.org> on 2021/09/28 10:40:00 UTC

[jira] [Resolved] (SPARK-36836) "sha2" expression with bit_length of 224 returns incorrect results

     [ https://issues.apache.org/jira/browse/SPARK-36836?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Gengliang Wang resolved SPARK-36836.
------------------------------------
    Fix Version/s: 3.2.0
       Resolution: Fixed

Issue resolved by pull request 34086
[https://github.com/apache/spark/pull/34086]

> "sha2" expression with bit_length of 224 returns incorrect results
> ------------------------------------------------------------------
>
>                 Key: SPARK-36836
>                 URL: https://issues.apache.org/jira/browse/SPARK-36836
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.0, 3.0.0, 3.1.0, 3.2.0
>            Reporter: Richard Chen
>            Priority: Major
>             Fix For: 3.2.0
>
>
> {{sha2(input, bit_length)}} returns incorrect results when {{bit_length == 224}}.
>  
> This bug seems to have been present since the {{sha2}} expression was introduced in 1.5.0.
>  
> Repro in spark shell:
> {{spark.sql("SELECT sha2('abc', 224)").show()}}
>  
> Spark currently returns a garbled string consisting of invalid UTF-8 bytes:
>  {{#\t}"4�"�B�w��U�*��你���l��}}
> The expected return value is: 
> {{23097d223405d8228642a477bda255b32aadbce4bda0b3f7e36c9da7}}
>  
> This appears to happen because {{MessageDigest.digest()}} returns raw digest bytes rather than a printable string. The output must therefore first be interpreted as a {{BigInt}} and then converted to a hex string before it is returned.
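> As a minimal, hypothetical sketch (not the actual code path changed by the Spark fix), the expected conversion can be reproduced with the standard JDK {{java.security.MessageDigest}} API:
> {code:scala}
> import java.nio.charset.StandardCharsets
> import java.security.MessageDigest
>
> // Compute the raw SHA-224 digest bytes of the input.
> val digest: Array[Byte] =
>   MessageDigest.getInstance("SHA-224").digest("abc".getBytes(StandardCharsets.UTF_8))
>
> // Interpret the bytes as an unsigned BigInt and render them as lowercase hex,
> // zero-padded to the full 56-character (224-bit) width.
> val hex: String = String.format("%056x", new java.math.BigInteger(1, digest))
>
> // hex == "23097d223405d8228642a477bda255b32aadbce4bda0b3f7e36c9da7"
> {code}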


