Posted to issues@spark.apache.org by "Andrew Or (JIRA)" <ji...@apache.org> on 2016/01/28 03:52:39 UTC

[jira] [Created] (SPARK-13055) SQLHistoryListener throws ClassCastException

Andrew Or created SPARK-13055:
---------------------------------

             Summary: SQLHistoryListener throws ClassCastException
                 Key: SPARK-13055
                 URL: https://issues.apache.org/jira/browse/SPARK-13055
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 1.5.0
            Reporter: Andrew Or
            Assignee: Andrew Or


{code}
16/01/27 18:46:28 ERROR ReplayListenerBus: Listener SQLHistoryListener threw an exception
java.lang.ClassCastException: java.lang.Integer cannot be cast to java.lang.Long
        at scala.runtime.BoxesRunTime.unboxToLong(BoxesRunTime.java:110)
        at org.apache.spark.sql.execution.ui.SQLHistoryListener$$anonfun$onTaskEnd$1$$anonfun$5.apply(SQLListener.scala:334)
        at org.apache.spark.sql.execution.ui.SQLHistoryListener$$anonfun$onTaskEnd$1$$anonfun$5.apply(SQLListener.scala:334)
        at scala.Option.map(Option.scala:145)
        at org.apache.spark.sql.execution.ui.SQLHistoryListener$$anonfun$onTaskEnd$1.apply(SQLListener.scala:334)
        at org.apache.spark.sql.execution.ui.SQLHistoryListener$$anonfun$onTaskEnd$1.apply(SQLListener.scala:332)
{code}

SQLHistoryListener listens for SparkListenerTaskEnd events, which carry non-SQL accumulator updates as well. We try to cast every accumulator update we encounter to Long, so any non-Long value triggers an error like the one above.
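
For illustration, the failure mode boils down to unboxing an Option[Any] update straight to Long. The standalone sketch below uses a hypothetical AccInfo case class as a stand-in for Spark's AccumulableInfo (the real listener code differs in detail); it reproduces the ClassCastException and shows a defensive alternative that pattern-matches on the runtime type and drops non-Long updates:

{code}
// Standalone sketch. AccInfo is a hypothetical stand-in for
// org.apache.spark.scheduler.AccumulableInfo, not Spark's actual class.
case class AccInfo(id: Long, name: String, update: Option[Any])

object CastRepro {
  def main(args: Array[String]): Unit = {
    val accums = Seq(
      AccInfo(1L, "sql metric", Some(42L)), // SQL metric: a Long
      AccInfo(2L, "user counter", Some(7))  // user accumulator: an Int
    )

    // Roughly what SQLHistoryListener.onTaskEnd does: cast every update
    // to Long. Some(7) boxes a java.lang.Integer, and unboxing it as a
    // Long throws java.lang.ClassCastException, as in the trace above.
    try {
      accums.map(a => a.update.map(_.asInstanceOf[Long]))
    } catch {
      case e: ClassCastException => println(s"reproduced: $e")
    }

    // Defensive alternative: keep only updates that really are Longs.
    val longUpdates = accums.flatMap { a =>
      a.update match {
        case Some(v: Long) => Some(a.id -> v)
        case _             => None // skip non-Long (e.g. user) accumulators
      }
    }
    println(longUpdates) // List((1,42))
  }
}
{code}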

Note: this was a problem even before internal accumulators were introduced. If a task used a user accumulator of any type other than Long, we would still hit the same exception.
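
To make that pre-existing case concrete, here is a minimal hedged sketch (assuming a live SparkContext on Spark 1.5.x) of a user accumulator whose per-task updates are Integers rather than Longs, which is enough to trip the cast when the event log is replayed:

{code}
import org.apache.spark.SparkContext

// Hedged sketch for Spark 1.5.x: sc.accumulator(0, ...) yields an
// Accumulator[Int], so its updates arrive in SparkListenerTaskEnd as
// java.lang.Integer values, exactly what the Long cast chokes on
// when the history server replays this application's event log.
def runWithIntAccumulator(sc: SparkContext): Unit = {
  val counter = sc.accumulator(0, "user counter") // Int, not Long
  sc.parallelize(1 to 100).foreach(_ => counter += 1)
  println(counter.value) // 100
}
{code}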



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org