Posted to issues@spark.apache.org by "Apache Spark (JIRA)" <ji...@apache.org> on 2019/06/07 05:53:00 UTC
[jira] [Assigned] (SPARK-26985) Test "access only some column of the all of columns" fails on big endian
[ https://issues.apache.org/jira/browse/SPARK-26985?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-26985:
------------------------------------
Assignee: Apache Spark
> Test "access only some column of the all of columns " fails on big endian
> -------------------------------------------------------------------------
>
> Key: SPARK-26985
> URL: https://issues.apache.org/jira/browse/SPARK-26985
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.3.2
> Environment: Linux Ubuntu 16.04
> openjdk version "1.8.0_202"
> OpenJDK Runtime Environment (build 1.8.0_202-b08)
> Eclipse OpenJ9 VM (build openj9-0.12.1, JRE 1.8.0 64-Bit Compressed References 20190205_218 (JIT enabled, AOT enabled)
> OpenJ9 - 90dd8cb40
> OMR - d2f4534b
> JCL - d002501a90 based on jdk8u202-b08)
>
> Reporter: Anuja Jakhade
> Assignee: Apache Spark
> Priority: Major
> Labels: BigEndian
> Attachments: DataFrameTungstenSuite.txt, InMemoryColumnarQuerySuite.txt, access only some column of the all of columns.txt
>
>
> While running the Apache Spark v2.3.2 tests with AdoptOpenJDK (OpenJ9) on a big-endian system, I am observing test failures in two suites of the SQL project:
> 1. InMemoryColumnarQuerySuite
> 2. DataFrameTungstenSuite
> In both suites, the test "access only some column of the all of columns" fails because of a mismatch in the final assert.
> The data returned after df.cache() appears to cause the error; please see the attached logs for details.
> cache() works correctly as long as no double or float columns are involved.
> Inside test !!!!!!- access only some column of the all of columns *** FAILED ***
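
Failures of this kind on big-endian platforms usually trace back to code that serializes primitive values with a fixed (little-endian) byte order but reads them back using the platform's native order. The following Python sketch (illustrative only, not Spark code) shows how a double written little-endian and reread big-endian produces a different value, which is the kind of mismatch that would trip an equality assert after caching:

```python
import struct

value = 1.5

# Serialize the double with an explicit little-endian byte order,
# as a writer on a little-endian platform effectively would.
le_bytes = struct.pack("<d", value)

# Round-tripping with the matching byte order recovers the value.
same = struct.unpack("<d", le_bytes)[0]

# Reading the same bytes assuming big-endian order (as a naive
# reader on a big-endian platform might) yields a different number.
wrong = struct.unpack(">d", le_bytes)[0]

print(value == same)   # matching order round-trips correctly
print(value == wrong)  # mismatched order corrupts the value
```

Integer columns can mask such a bug when values are small or symmetric under byte swapping, which may be why the failure only surfaces for double and float columns here.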
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org