Posted to issues@spark.apache.org by "Andrew Or (JIRA)" <ji...@apache.org> on 2015/04/28 05:32:05 UTC

[jira] [Created] (SPARK-7180) SerializationDebugger fails when attempting to serialize a FunSuite

Andrew Or created SPARK-7180:
--------------------------------

             Summary: SerializationDebugger fails when attempting to serialize a FunSuite
                 Key: SPARK-7180
                 URL: https://issues.apache.org/jira/browse/SPARK-7180
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 1.3.0
            Reporter: Andrew Or


This is most likely not specific to FunSuite, but when I try to serialize one (don't ask why), e.g. ExecutorAllocationManagerSuite, I get an ArrayIndexOutOfBoundsException.

I dug into this a little and found that `SerializationDebugger#visitSerializable` incorrectly associates one class's fields with another class's values. For instance, here is the output I generated by adding println's everywhere:

{code}
* Visiting ExecutorAllocationManagerSuite (org.apache.spark.ExecutorAllocationManagerSuite)
*** visiting serializable object (class org.apache.spark.ExecutorAllocationManagerSuite, ExecutorAllocationManagerSuite)
  Final object = ExecutorAllocationManagerSuite + org.apache.spark.ExecutorAllocationManagerSuite
  Final object description org.apache.spark.ExecutorAllocationManagerSuite: static final long serialVersionUID = 5565470274968132811L;
  Slot descs 2
    - org.scalatest.FunSuite: static final long serialVersionUID = -5883370421614863475L;
      > fields = 4
        >> Lorg/scalatest/Suite$NoArgTest$; NoArgTest$module
        >> Lorg/scalatest/Assertions$AssertionsHelper; assertionsHelper
        >> Lorg/scalatest/Engine; org$scalatest$FunSuiteLike$$engine
        >> Ljava/lang/String; styleName
      > numObjFields = 4
    - org.apache.spark.ExecutorAllocationManagerSuite: static final long serialVersionUID = 5565470274968132811L;
      > fields = 5
        >> Z invokeBeforeAllAndAfterAllEvenIfNoTestsAreExpected
        >> Z org$scalatest$BeforeAndAfter$$runHasBeenInvoked
        >> Lscala/collection/mutable/ListBuffer; org$apache$spark$ExecutorAllocationManagerSuite$$contexts
        >> Ljava/util/concurrent/atomic/AtomicReference; org$scalatest$BeforeAndAfter$$afterFunctionAtomic
        >> Ljava/util/concurrent/atomic/AtomicReference; org$scalatest$BeforeAndAfter$$beforeFunctionAtomic
      > numObjFields = 3
{code}

We can see that ExecutorAllocationManagerSuite has two class data slot descriptions. The first refers to fields that belong to FunSuite, and the second to fields that belong to ExecutorAllocationManagerSuite.

Later, however, when we look at the fields that belong to FunSuite, we try to assign ExecutorAllocationManagerSuite's field values to them. This happens because the object we run reflection on has type ExecutorAllocationManagerSuite, and the mismatch in field counts causes the ArrayIndexOutOfBoundsException.
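For illustration, here is a minimal standalone sketch of the descriptor-per-slot behavior in Java serialization that this mismatch hinges on. The class names (ParentSuite, ChildSuite) are hypothetical stand-ins for the FunSuite/ExecutorAllocationManagerSuite hierarchy, not Spark or ScalaTest code: each class in a Serializable hierarchy gets its own descriptor listing only the fields it declares, so sizing a value buffer from one slot's descriptor and filling it via another's is the kind of mismatch that overflows.

{code}
import java.io.ObjectStreamClass;
import java.io.Serializable;

// Hypothetical stand-ins for a superclass and subclass in a Serializable hierarchy.
class ParentSuite implements Serializable {
    private Object helperA;   // object field
    private Object helperB;   // object field
}

class ChildSuite extends ParentSuite {
    private boolean flag;     // primitive: counted in getFields() but not among object-field values
    private Object contexts;  // object field
    private Object atomicRef; // object field
}

public class SlotDescDemo {
    public static void main(String[] args) {
        // Java serialization describes each class in the hierarchy with its
        // own descriptor ("slot"); a descriptor lists only the fields
        // *declared* by that class, never the inherited ones.
        ObjectStreamClass childDesc = ObjectStreamClass.lookup(ChildSuite.class);
        ObjectStreamClass parentDesc = ObjectStreamClass.lookup(ParentSuite.class);

        // ChildSuite declares 3 serializable fields (1 primitive + 2 objects);
        // ParentSuite declares 2. Reading values for the parent slot while
        // using the child's descriptor (or vice versa) misaligns the arrays.
        System.out.println("child fields:  " + childDesc.getFields().length);
        System.out.println("parent fields: " + parentDesc.getFields().length);
    }
}
{code}

This mirrors the JIRA output above, where the FunSuite slot reports fields = 4 / numObjFields = 4 but the ExecutorAllocationManagerSuite slot reports fields = 5 / numObjFields = 3 (the two booleans are primitives, so they don't appear among the object-field values).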

The offending line is: https://github.com/apache/spark/blob/4d9e560b5470029143926827b1cb9d72a0bfbeff/core/src/main/scala/org/apache/spark/serializer/SerializationDebugger.scala#L150



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org