Posted to issues@spark.apache.org by "salamani (JIRA)" <ji...@apache.org> on 2019/02/25 07:44:00 UTC

[jira] [Created] (SPARK-26983) Spark PassThroughSuite failure on bigendian

salamani created SPARK-26983:
--------------------------------

             Summary: Spark PassThroughSuite failure on bigendian
                 Key: SPARK-26983
                 URL: https://issues.apache.org/jira/browse/SPARK-26983
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.3.2
            Reporter: salamani
             Fix For: 2.3.2


The following failures are observed for PassThroughSuite in Spark Project SQL when the tests are run on a big-endian platform (a short sketch of the suspected byte-order mismatch follows the log):

```
 - PassThrough with FLOAT: empty column for decompress()
 - PassThrough with FLOAT: long random series for decompress() *** FAILED ***
 Expected 0.10990685, but got -6.6357654E14 Wrong 0-th decoded float value (PassThroughEncodingSuite.scala:146)
 - PassThrough with FLOAT: simple case with null for decompress() *** FAILED ***
 Expected 2.0, but got 9.0E-44 Wrong 0-th decoded float value (PassThroughEncodingSuite.scala:146)
 - PassThrough with DOUBLE: empty column
 - PassThrough with DOUBLE: long random series
 - PassThrough with DOUBLE: empty column for decompress()
 - PassThrough with DOUBLE: long random series for decompress() *** FAILED ***
 Expected 0.20634564007984624, but got 5.902392643940031E-230 Wrong 0-th decoded double value (PassThroughEncodingSuite.scala:150)
 - PassThrough with DOUBLE: simple case with null for decompress() *** FAILED ***
 Expected 2.0, but got 3.16E-322 Wrong 0-th decoded double value (PassThroughEncodingSuite.scala:150)
 Run completed in 9 seconds, 72 milliseconds.
 Total number of tests run: 30
 Suites: completed 2, aborted 0
 Tests: succeeded 26, failed 4, canceled 0, ignored 0, pending 0
 *** 4 TESTS FAILED ***
 ```
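The expected-vs-actual values above are consistent with the encoder and decoder disagreeing on byte order: 2.0f written as little-endian bytes and read back as big-endian decodes to exactly 9.0E-44, the value reported in the FLOAT failure, and 2.0 as a double decodes to roughly 3.16E-322, matching the DOUBLE failure. The following is a minimal, self-contained Scala sketch of that effect; `EndianMismatchDemo` is an illustrative standalone object, not part of Spark's PassThrough encoding code or its test suite.

```scala
// Sketch only: demonstrates how a byte-order mismatch turns 2.0f into 9.0E-44,
// the exact value seen in the failure log above.
import java.nio.{ByteBuffer, ByteOrder}

object EndianMismatchDemo {
  def main(args: Array[String]): Unit = {
    val expected = 2.0f

    // Encode 2.0f using little-endian byte order (the native order on x86).
    val buf = ByteBuffer.allocate(4).order(ByteOrder.LITTLE_ENDIAN)
    buf.putFloat(0, expected)

    // Decode the same four bytes assuming big-endian byte order instead.
    val decoded = buf.order(ByteOrder.BIG_ENDIAN).getFloat(0)

    println(s"expected = $expected, decoded = $decoded") // decoded = 9.0E-44

    // The DOUBLE failures follow the same pattern: 2.0 read with swapped
    // byte order decodes to roughly 3.16E-322.
  }
}
```

If this is indeed the cause, the fix would be to make the write and read paths agree on a single byte order rather than mixing the platform's native order with a hard-coded one; that is an assumption based on the values above, not a confirmed diagnosis from the Spark source.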


