Posted to issues@spark.apache.org by "Simrit Kaur (Jira)" <ji...@apache.org> on 2021/05/25 16:25:00 UTC

[jira] [Created] (SPARK-35520) Spark-SQL test fails on IBM Z for certain config combinations.

Simrit Kaur created SPARK-35520:
-----------------------------------

             Summary: Spark-SQL test fails on IBM Z for certain config combinations.
                 Key: SPARK-35520
                 URL: https://issues.apache.org/jira/browse/SPARK-35520
             Project: Spark
          Issue Type: Bug
          Components: Spark Core, SQL
    Affects Versions: 3.1.1
            Reporter: Simrit Kaur


Some queries in the SQL test files in-joins.sql, in-order-by.sql, and not-in-group-by.sql, as well as in SubquerySuite.scala, fail with specific configuration combinations on IBM Z (s390x).

For example: 

the query sql("select * from l where a = 6 and a not in (select c from r where c is not null)") from SubquerySuite.scala fails for the following config combinations:
|enableNAAJ|enableAQE|enableCodegen|
|TRUE|FALSE|FALSE|
|TRUE|TRUE|FALSE|
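The query exercises null-aware "NOT IN" semantics, which the null-aware anti-join (NAAJ) path optimizes. A minimal pure-Python sketch of what the query should compute, using made-up stand-in data for the l and r tables (the actual SubquerySuite fixtures are not shown in this report):

```python
# Pure-Python model of the failing query's semantics:
#   select * from l where a = 6 and a not in (select c from r where c is not null)
# The rows below are hypothetical stand-ins, not SubquerySuite's real data.

l = [(6, 1.0), (6, 2.0), (2, 1.0), (None, 5.0)]  # rows (a, b) -- made-up data
r = [(4, None), (None, 3.0), (5, 2.0)]           # rows (c, d) -- made-up data

# Subquery: c values from r where c is not null.
subquery = [c for (c, _) in r if c is not None]

# Because every NULL is filtered out of the subquery, "a NOT IN (...)" here
# reduces to a plain anti-join on the remaining values.
result = [(a, b) for (a, b) in l if a == 6 and a not in subquery]

print(result)  # -> [(6, 1.0), (6, 2.0)]
```

Whatever plan Spark picks (NAAJ on or off, AQE on or off, codegen on or off), the result set must match this reference semantics; the bug is that certain plan combinations on s390x do not.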

These combinations also cause two other queries, in in-joins.sql and in-order-by.sql, to fail.

Another query: 

SELECT Count(*)
FROM   (SELECT *
        FROM   t2
        WHERE  t2a NOT IN (SELECT t3a
                           FROM   t3
                           WHERE  t3h != t2h)) t2
WHERE  t2b NOT IN (SELECT Min(t2b)
                   FROM   t2
                   WHERE  t2b = t2b
                   GROUP  BY t2c);

from not-in-group-by.sql fails for the following combinations:
|enableAQE|enableCodegen|
|FALSE|TRUE|
|FALSE|FALSE|
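The flag names in the tables above read like test-matrix shorthands. Assuming they map to the usual Spark SQL configuration keys (spark.sql.optimizeNullAwareAntiJoin, which SPARK-32290 added, spark.sql.adaptive.enabled, and spark.sql.codegen.wholeStage; this mapping is my assumption, so verify it against your Spark build), a small helper can render a combination as spark-sql --conf arguments for reproduction:

```python
# Hypothetical mapping from the report's flag names to Spark SQL conf keys.
# The mapping is an assumption based on standard Spark 3.1 configuration names.
CONF_KEYS = {
    "enableNAAJ": "spark.sql.optimizeNullAwareAntiJoin",  # added by SPARK-32290
    "enableAQE": "spark.sql.adaptive.enabled",
    "enableCodegen": "spark.sql.codegen.wholeStage",
}

def conf_args(**flags):
    """Render True/False flag settings as spark-sql --conf arguments."""
    return " ".join(
        f"--conf {CONF_KEYS[name]}={str(value).lower()}"
        for name, value in flags.items()
    )

# First failing combination from the table above:
print(conf_args(enableAQE=False, enableCodegen=True))
# -> --conf spark.sql.adaptive.enabled=false --conf spark.sql.codegen.wholeStage=true
```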


These test cases do not fail on the 3.0.1 release; I believe the regression may have been introduced by SPARK-32290.

Another strange behaviour: if the expected output is 1, 3, I get 1, 3, 9; if I then update the golden file to expect 1, 3, 9, the output becomes 1, 3.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
