Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2022/05/23 03:29:00 UTC

[jira] [Updated] (SPARK-39252) Flaky Test: pyspark.sql.tests.test_dataframe.DataFrameTests test_df_is_empty

     [ https://issues.apache.org/jira/browse/SPARK-39252?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Hyukjin Kwon updated SPARK-39252:
---------------------------------
    Description: 
{{test_df_is_empty}} is flaky. It failed, for example, in a recent PR: https://github.com/apache/spark/pull/36580
https://github.com/panbingkun/spark/runs/6525997469?check_suite_focus=true

Possibly introduced by SPARK-39084.

{code}
test_df_is_empty (pyspark.sql.tests.test_dataframe.DataFrameTests) ... 
[Stage 6:>                                                          (0 + 1) / 1]

                                                                                
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x00007fd84a1486ff, pid=4021, tid=0x00007fd8016a2700
#
# JRE version: OpenJDK Runtime Environment (Zulu 8.62.0.19-CA-linux64) (8.0_332-b09) (build 1.8.0_332-b09)
# Java VM: OpenJDK 64-Bit Server VM (25.332-b09 mixed mode linux-amd64 compressed oops)
# Problematic frame:
# J 9116 C2 org.apache.spark.unsafe.UnsafeAlignedOffset.getSize(Ljava/lang/Object;J)I (51 bytes) @ 0x00007fd84a1486ff [0x00007fd84a1486e0+0x1f]
#
# Core dump written. Default location: /__w/spark/spark/core or core.4021
#
# An error report file with more information is saved as:
# /__w/spark/spark/hs_err_pid4021.log
#
# If you would like to submit a bug report, please visit:
#   http://www.azul.com/support/
#
----------------------------------------
Exception happened during processing of request from ('127.0.0.1', 36358)
Traceback (most recent call last):
  File "/usr/lib/pypy3/lib-python/3/socketserver.py", line 316, in _handle_request_noblock
    self.process_request(request, client_address)
  File "/usr/lib/pypy3/lib-python/3/socketserver.py", line 347, in process_request
    self.finish_request(request, client_address)
  File "/usr/lib/pypy3/lib-python/3/socketserver.py", line 360, in finish_request
    self.RequestHandlerClass(request, client_address, self)
  File "/usr/lib/pypy3/lib-python/3/socketserver.py", line 720, in __init__
    self.handle()
  File "/__w/spark/spark/python/pyspark/accumulators.py", line 281, in handle
    poll(accum_updates)
  File "/__w/spark/spark/python/pyspark/accumulators.py", line 253, in poll
    if func():
  File "/__w/spark/spark/python/pyspark/accumulators.py", line 257, in accum_updates
    num_updates = read_int(self.rfile)
  File "/__w/spark/spark/python/pyspark/serializers.py", line 595, in read_int
    raise EOFError
{code}
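The {{EOFError}} at the bottom of the traceback is a symptom rather than the cause: the accumulator server in the Python driver notices that the segfaulted JVM closed its socket mid-message. A simplified sketch of what {{read_int}} in {{pyspark/serializers.py}} does (not the exact Spark source, just the shape of the logic):

```python
import struct

def read_int(stream):
    # Read a 4-byte big-endian int frame header from the stream.
    # If the peer (here, the crashed JVM) closed the connection before
    # sending a full frame, the read comes up short and EOFError is
    # raised -- the same error the accumulator server reports above.
    length = stream.read(4)
    if len(length) != 4:
        raise EOFError
    return struct.unpack("!i", length)[0]
```

So the real failure to investigate is the SIGSEGV in {{org.apache.spark.unsafe.UnsafeAlignedOffset.getSize}} on the JVM side; the Python-side traceback just records the dropped connection.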

  was:
{{test_df_is_empty}} is flaky. For example, a recent PR: https://github.com/apache/spark/pull/36580
https://github.com/panbingkun/spark/runs/6525997469?check_suite_focus=true

Possibly introduced from SPARK-33277


> Flaky Test: pyspark.sql.tests.test_dataframe.DataFrameTests test_df_is_empty
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-39252
>                 URL: https://issues.apache.org/jira/browse/SPARK-39252
>             Project: Spark
>          Issue Type: Test
>          Components: PySpark
>    Affects Versions: 3.3.0
>            Reporter: Hyukjin Kwon
>            Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.20.7#820007)
