Posted to issues@spark.apache.org by "Sean R. Owen (Jira)" <ji...@apache.org> on 2019/10/26 23:51:00 UTC
[jira] [Commented] (SPARK-25907) SIGBUS (0xa) when using DataFrameWriter.insertInto
[ https://issues.apache.org/jira/browse/SPARK-25907?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16960473#comment-16960473 ]
Sean R. Owen commented on SPARK-25907:
--------------------------------------
... can you show the reproduction?
> SIGBUS (0xa) when using DataFrameWriter.insertInto
> --------------------------------------------------
>
> Key: SPARK-25907
> URL: https://issues.apache.org/jira/browse/SPARK-25907
> Project: Spark
> Issue Type: Bug
> Components: Java API
> Affects Versions: 2.3.2
> Reporter: Alexander Zautke
> Priority: Major
> Attachments: DiagnosticReport.txt, hs_err_pid18703.log
>
>
> Hi everyone!
> I am currently running into the issue that a call to
> DataFrameWriter.insertInto is reproducibly crashing the JVM.
> #
> # A fatal error has been detected by the Java Runtime Environment:
> #
> # SIGBUS (0xa) at pc=0x00000001194a3520, pid=16154,
> tid=0x0000000000008417
> #
> # JRE version: Java(TM) SE Runtime Environment (8.0_121-b13) (build
> 1.8.0_121-b13)
> # Java VM: Java HotSpot(TM) 64-Bit Server VM (25.121-b13 mixed mode
> bsd-amd64 compressed oops)
> # Problematic frame:
> # v ~StubRoutines::jshort_disjoint_arraycopy
> #
> # Failed to write core dump. Core dumps have been disabled. To enable
> core dumping, try "ulimit -c unlimited" before starting Java again
> #
> # If you would like to submit a bug report, please visit:
> # http://bugreport.java.com/bugreport/crash.jsp
> #
> The last call before the crash is made to
> org.apache.spark.unsafe.types.UTF8String.getBytes().
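For context, a minimal sketch of the insertInto call pattern the report describes. The table name, schema, and data here are assumptions for illustration, not taken from the reporter's attachments; note that insertInto matches columns by position and requires the target table to already exist:

```scala
import org.apache.spark.sql.SparkSession

object InsertIntoSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("insertInto-sketch")
      .getOrCreate()
    import spark.implicits._

    // Create a target table first; insertInto fails if it does not exist.
    // Table name "target" is a placeholder, not from the report.
    Seq(("a", 1), ("b", 2)).toDF("key", "value")
      .write.saveAsTable("target")

    // insertInto appends rows to the existing table, resolving columns
    // by position (not by name), unlike saveAsTable.
    Seq(("c", 3)).toDF("key", "value")
      .write.insertInto("target")

    spark.stop()
  }
}
```

Running this requires a Spark distribution on the classpath; it is a sketch of the API usage only, not a reproduction of the SIGBUS reported above.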
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org