Posted to issues@spark.apache.org by "Gabor Somogyi (Jira)" <ji...@apache.org> on 2020/10/29 13:16:00 UTC
[jira] [Commented] (SPARK-33273) Fix Flaky Test: ThriftServerQueryTestSuite. subquery_scalar_subquery_scalar_subquery_select_sql
[ https://issues.apache.org/jira/browse/SPARK-33273?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17222896#comment-17222896 ]
Gabor Somogyi commented on SPARK-33273:
---------------------------------------
I've just run into this too: https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/130409/testReport/org.apache.spark.sql.hive.thriftserver/ThriftServerQueryTestSuite/subquery_scalar_subquery_scalar_subquery_select_sql/
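For anyone unfamiliar with the failing query shape: each parenthesized SELECT is a scalar subquery, which must evaluate to a single value that the outer query attaches to every qualifying row of t1. The diff ("[1]0" expected vs "[]0" actual) indicates the first column's value differed between runs. A minimal Python sketch of that evaluation order, with hypothetical fixture rows (not the actual t1/t2/t3 test data):

```python
# Models:
#   SELECT (SELECT min(t3d) FROM t3) min_t3d,
#          (SELECT max(t2h) FROM t2) max_t2h
#   FROM t1 WHERE t1a = 'val1c'
# Fixture contents below are made up for illustration only.
t1 = [{"t1a": "val1a"}, {"t1a": "val1c"}]
t2 = [{"t2h": "2017-05-04 01:01:00"}]
t3 = [{"t3d": 10}, {"t3d": 12}]

# Each scalar subquery collapses its table to one value.
min_t3d = min(r["t3d"] for r in t3)
max_t2h = max(r["t2h"] for r in t2)

# The outer query pairs both scalars with every matching t1 row.
result = [(min_t3d, max_t2h) for r in t1 if r["t1a"] == "val1c"]
print(result)  # [(10, '2017-05-04 01:01:00')]
```

If the shared temp tables are still being populated (or were modified by a concurrently running test) when the query executes, the subquery scalars change and the comparison fails, which would match the flaky behavior seen on Jenkins.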
> Fix Flaky Test: ThriftServerQueryTestSuite. subquery_scalar_subquery_scalar_subquery_select_sql
> -----------------------------------------------------------------------------------------------
>
> Key: SPARK-33273
> URL: https://issues.apache.org/jira/browse/SPARK-33273
> Project: Spark
> Issue Type: Test
> Components: Tests
> Affects Versions: 3.1.0
> Reporter: Dongjoon Hyun
> Priority: Major
>
> - https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/130369/testReport/org.apache.spark.sql.hive.thriftserver/ThriftServerQueryTestSuite/subquery_scalar_subquery_scalar_subquery_select_sql/
> {code}
> [info] - subquery/scalar-subquery/scalar-subquery-select.sql *** FAILED *** (3 seconds, 877 milliseconds)
> [info] Expected "[1]0 2017-05-04 01:01:0...", but got "[]0 2017-05-04 01:01:0..." Result did not match for query #3
> [info] SELECT (SELECT min(t3d) FROM t3) min_t3d,
> [info] (SELECT max(t2h) FROM t2) max_t2h
> [info] FROM t1
> [info] WHERE t1a = 'val1c' (ThriftServerQueryTestSuite.scala:197)
> {code}
--
This message was sent by Atlassian Jira
(v8.3.4#803005)