Posted to user@spark.apache.org by Walrus theCat <wa...@gmail.com> on 2014/03/18 00:17:53 UTC

inexplicable exceptions in Spark 0.7.3

Hi,

I'm getting the stack trace below while using Spark 0.7.3.  It contains no references to
anything in my code, and I've never seen anything like this before.  Any ideas what is
going on?

java.lang.ClassCastException: spark.SparkContext$$anonfun$9 cannot be cast to scala.Function2
    at spark.scheduler.ResultTask$.deserializeInfo(ResultTask.scala:43)
    at spark.scheduler.ResultTask.readExternal(ResultTask.scala:106)
    at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:1837)
    at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1796)
    at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1350)
    at java.io.ObjectInputStream.readObject(ObjectInputStream.java:370)
    at spark.JavaDeserializationStream.readObject(JavaSerializer.scala:23)
    at spark.JavaSerializerInstance.deserialize(JavaSerializer.scala:45)
    at spark.executor.Executor$TaskRunner.run(Executor.scala:96)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:744)
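
For context on the frames above: ResultTask.readExternal is the executor deserializing a
task it was sent, and deserializeInfo is where the task's function is read back and cast
to a two-argument function (scala.Function2) before it can run.  The sketch below is a
rough paraphrase of that shape, not the actual 0.7.3 source; the cast is the point where a
ClassCastException like this surfaces if the class the worker loads for
spark.SparkContext$$anonfun$9 is not the same build the driver serialized (mismatched or
stale jars between driver and workers are a common cause, though not the only possible one).

    import java.io.{ByteArrayInputStream, ObjectInputStream}

    // Rough illustration only; names and types are simplified and are not the
    // actual Spark 0.7.3 ResultTask code.
    object DeserializeSketch {
      def readTaskFunc(bytes: Array[Byte]): (AnyRef, Iterator[_]) => Any = {
        val in = new ObjectInputStream(new ByteArrayInputStream(bytes))
        try {
          // This asInstanceOf is the kind of cast that throws
          // java.lang.ClassCastException when the object read back is not a
          // scala.Function2 under the worker's class loader, e.g. because the
          // worker's classpath carries a different build of the same anonymous class.
          in.readObject().asInstanceOf[(AnyRef, Iterator[_]) => Any]
        } finally {
          in.close()
        }
      }
    }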

Re: inexplicable exceptions in Spark 0.7.3

Posted by Walrus theCat <wa...@gmail.com>.
Hi Andrew,

Thanks for your interest.  This is a standalone job.


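For a standalone job, one thing worth checking is that the workers run the exact same
Spark 0.7.3 build the job was compiled against, and that the application jar is passed to
the SparkContext constructor so executors load the same closure classes as the driver.  A
minimal 0.7.x-style driver sketch follows; the master URL, SPARK_HOME lookup, jar path,
and input path are placeholders, not values from this thread.

    import spark.SparkContext
    import spark.SparkContext._

    object StandaloneJobSketch {
      def main(args: Array[String]) {
        // Master URL, SPARK_HOME, and jar path are illustrative placeholders.
        val sc = new SparkContext(
          "spark://master:7077",
          "StandaloneJobSketch",
          System.getenv("SPARK_HOME"),
          Seq("target/my-job.jar"))   // ship the application jar to the executors

        // A simple job whose closures end up serialized into tasks and
        // deserialized on the executors, as in the stack trace above.
        val total = sc.textFile("hdfs://namenode:9000/input.txt")
          .map(_.length)
          .reduce(_ + _)
        println("total characters: " + total)
        sc.stop()
      }
    }
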
On Mon, Mar 17, 2014 at 4:30 PM, Andrew Ash <an...@andrewash.com> wrote:

> Are you running from the spark shell or from a standalone job?

Re: inexplicable exceptions in Spark 0.7.3

Posted by Andrew Ash <an...@andrewash.com>.
Are you running from the spark shell or from a standalone job?

