Posted to issues@spark.apache.org by "Rohit Mishra (Jira)" <ji...@apache.org> on 2020/08/18 04:40:00 UTC

[jira] [Commented] (SPARK-32636) AsyncEventQueue: Exception scala.Some cannot be cast to java.lang.String

    [ https://issues.apache.org/jira/browse/SPARK-32636?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17179354#comment-17179354 ] 

Rohit Mishra commented on SPARK-32636:
--------------------------------------

[~abubakarj], thanks for raising the bug, but please refrain from setting the Fix Version field, as it is reserved for committers. Thanks. 

>  AsyncEventQueue: Exception scala.Some cannot be cast to java.lang.String
> -------------------------------------------------------------------------
>
>                 Key: SPARK-32636
>                 URL: https://issues.apache.org/jira/browse/SPARK-32636
>             Project: Spark
>          Issue Type: Bug
>          Components: Java API
>    Affects Versions: 3.0.0
>            Reporter: Muhammad Abubakar
>            Priority: Major
>         Attachments: err.log
>
>
> Spark 3.0.0, Hadoop 3.2, Hive 2.3.7 (built-in)
>  
> A Java exception occurs when trying to run a memory-intensive job. Although enough resources are available on the machine, the actual exception does not look like it was caused by a memory issue.
> {code:java}
> java.lang.ClassCastException: scala.Some cannot be cast to java.lang.String
>  at org.json4s.JsonDSL.pair2jvalue(JsonDSL.scala:82)
>  at org.json4s.JsonDSL.pair2jvalue$(JsonDSL.scala:82)
>  at org.json4s.JsonDSL$.pair2jvalue(JsonDSL.scala:64)
>  at org.apache.spark.util.JsonProtocol$.taskInfoToJson(JsonProtocol.scala:309)
>  at org.apache.spark.util.JsonProtocol$.taskStartToJson(JsonProtocol.scala:131)
>  at org.apache.spark.util.JsonProtocol$.sparkEventToJson(JsonProtocol.scala:75)
>  at org.apache.spark.scheduler.EventLoggingListener.logEvent(EventLoggingListener.scala:97)
>  at org.apache.spark.scheduler.EventLoggingListener.onTaskStart(EventLoggingListener.scala:114)
>  at org.apache.spark.scheduler.SparkListenerBus.doPostEvent(SparkListenerBus.scala:41)
>  at org.apache.spark.scheduler.SparkListenerBus.doPostEvent$(SparkListenerBus.scala:28)
>  at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37)
>  at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37)
>  at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:115)
>  at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:99)
>  at org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:105)
>  at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:105)
>  at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
>  at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
>  at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100)
>  at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96)
>  at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1319)
>  at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96)
> #
> # A fatal error has been detected by the Java Runtime Environment:
> #
> #  SIGSEGV (0xb) at pc=0x00007f9d2ec0cc6b, pid=30234, tid=0x00007f9174a9e700
> #
> # JRE version: Java(TM) SE Runtime Environment (8.0_101-b13) (build 1.8.0_101-b13)
> # Java VM: Java HotSpot(TM) 64-Bit Server VM (25.101-b13 mixed mode linux-amd64 )
> # Problematic frame:
> # V  [libjvm.so+0x7c9c6b]  Klass::external_name() const+0x1b
> 20/08/11 23:31:38 ERROR AsyncEventQueue: Listener AppStatusListener threw an exception
> java.lang.ClassCastException: scala.Some cannot be cast to java.lang.String
>  at org.apache.spark.status.LiveEntityHelpers$.weakIntern(LiveEntity.scala:665)
>  at org.apache.spark.status.LiveTask.doUpdate(LiveEntity.scala:209)
>  at org.apache.spark.status.LiveEntity.write(LiveEntity.scala:51)
>  at org.apache.spark.status.AppStatusListener.update(AppStatusListener.scala:1088)
>  at org.apache.spark.status.AppStatusListener.liveUpdate(AppStatusListener.scala:1101)
>  at org.apache.spark.status.AppStatusListener.onTaskStart(AppStatusListener.scala:512)
>  at org.apache.spark.scheduler.SparkListenerBus.doPostEvent(SparkListenerBus.scala:41)
>  at org.apache.spark.scheduler.SparkListenerBus.doPostEvent$(SparkListenerBus.scala:28)
>  at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37)
>  at org.apache.spark.scheduler.AsyncEventQueue.doPostEvent(AsyncEventQueue.scala:37)
>  at org.apache.spark.util.ListenerBus.postToAll(ListenerBus.scala:115)
>  at org.apache.spark.util.ListenerBus.postToAll$(ListenerBus.scala:99)
>  at org.apache.spark.scheduler.AsyncEventQueue.super$postToAll(AsyncEventQueue.scala:105)
>  at org.apache.spark.scheduler.AsyncEventQueue.$anonfun$dispatch$1(AsyncEventQueue.scala:105)
>  at scala.runtime.java8.JFunction0$mcJ$sp.apply(JFunction0$mcJ$sp.java:23)
>  at scala.util.DynamicVariable.withValue(DynamicVariable.scala:62)
>  at org.apache.spark.scheduler.AsyncEventQueue.org$apache$spark$scheduler$AsyncEventQueue$$dispatch(AsyncEventQueue.scala:100)
>  at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.$anonfun$run$1(AsyncEventQueue.scala:96)
>  at org.apache.spark.util.Utils$.tryOrStopSparkContext(Utils.scala:1319)
>  at org.apache.spark.scheduler.AsyncEventQueue$$anon$2.run(AsyncEventQueue.scala:96)
> [thread 140262972720896 also had an error]
> #
> # Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
> #
> # An error report file with more information is saved as:
> {code}
> See the attached err.log generated by Spark: [^err.log]
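> For context, a minimal sketch of the failure pattern itself (this is not Spark's code; the object name and value below are hypothetical): an {{Option[String]}} that reaches a code path typed as {{String}} fails with exactly this exception when the cast is checked at runtime.
> {code:scala}
> // Minimal sketch, not Spark code: a value that is really Some("...") lands
> // where a plain String is expected (hypothetical stand-in for a TaskInfo field).
> object CastDemo extends App {
>   val host: Any = Some("worker-1") // an Option[String], not a String
>   // The cast compiles, but the JVM's runtime check throws:
>   // java.lang.ClassCastException: scala.Some cannot be cast to java.lang.String
>   val s = host.asInstanceOf[String]
>   println(s.length)
> }
> {code}
> Unwrapping the Option (e.g. with {{getOrElse}}) or correcting the field's declared type avoids the cast.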



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org