Posted to issues@spark.apache.org by "Apache Spark (Jira)" <ji...@apache.org> on 2020/08/05 11:23:00 UTC

[jira] [Assigned] (SPARK-32485) RecordBinaryComparatorSuite test failures on big-endian systems

     [ https://issues.apache.org/jira/browse/SPARK-32485?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Apache Spark reassigned SPARK-32485:
------------------------------------

    Assignee: Apache Spark

> RecordBinaryComparatorSuite test failures on big-endian systems
> ---------------------------------------------------------------
>
>                 Key: SPARK-32485
>                 URL: https://issues.apache.org/jira/browse/SPARK-32485
>             Project: Spark
>          Issue Type: Bug
>          Components: Tests
>    Affects Versions: 3.0.0
>            Reporter: Michael Munday
>            Assignee: Apache Spark
>            Priority: Minor
>              Labels: endianness
>
> The fix for SPARK-29918 broke two tests on big-endian systems:
>  * testBinaryComparatorWhenSubtractionIsDivisibleByMaxIntValue
>  * testBinaryComparatorWhenSubtractionCanOverflowLongValue
> These tests date from a time when subtraction was used for multi-byte comparisons; they feed specific values into the comparison to trigger old bugs. The fix for SPARK-29918 changed the order in which bytes are compared when comparing 8 bytes at a time on little-endian systems (to match the normal byte-by-byte comparison) but did not affect big-endian systems. The expected output of the tests, however, was updated for all systems regardless of endianness, so the tests now fail on big-endian systems.
> It is also unclear whether the values compared in the tests still match their original intent, now that the bytes in those values are compared in memory order (equivalent to reversing the bytes of each value before the comparison).
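The endianness issue described above can be sketched in a few lines. This is an illustrative standalone example, not Spark's actual RecordBinaryComparator code: the class and method names are invented, but it shows why an 8-bytes-at-a-time comparison read as a long only matches byte-by-byte (lexicographic) order on little-endian machines if the bytes are reversed first, which is what the SPARK-29918 fix did.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class EndianCompareSketch {
    // Reference semantics: unsigned byte-by-byte (lexicographic) comparison.
    static int byteByByte(byte[] a, byte[] b) {
        for (int i = 0; i < 8; i++) {
            int cmp = Integer.compare(a[i] & 0xff, b[i] & 0xff);
            if (cmp != 0) return cmp;
        }
        return 0;
    }

    // 8-bytes-at-a-time comparison. On a little-endian machine the first
    // byte in memory becomes the LEAST significant byte of the long, so a
    // plain unsigned long comparison disagrees with byte order unless the
    // bytes are reversed first. On big-endian machines no reversal is
    // needed, which is why the fix left those systems untouched.
    static int eightAtATime(byte[] a, byte[] b, ByteOrder order) {
        long la = ByteBuffer.wrap(a).order(order).getLong();
        long lb = ByteBuffer.wrap(b).order(order).getLong();
        if (order == ByteOrder.LITTLE_ENDIAN) {
            la = Long.reverseBytes(la);
            lb = Long.reverseBytes(lb);
        }
        return Long.compareUnsigned(la, lb);
    }

    public static void main(String[] args) {
        // First bytes differ, so byte-by-byte order is decided immediately.
        byte[] a = {0x01, 0, 0, 0, 0, 0, 0, 0x02};
        byte[] b = {0x02, 0, 0, 0, 0, 0, 0, 0x01};
        // All three agree that a sorts before b (negative result).
        System.out.println(byteByByte(a, b));
        System.out.println(eightAtATime(a, b, ByteOrder.LITTLE_ENDIAN));
        System.out.println(eightAtATime(a, b, ByteOrder.BIG_ENDIAN));
    }
}
```

Without the Long.reverseBytes step on the little-endian path, the last byte in memory would dominate the comparison and b would incorrectly sort before a, which is the class of bug these tests were written to catch.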



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org