Posted to user@spark.apache.org by Richard Siebeling <rs...@gmail.com> on 2014/01/20 13:29:47 UTC

Loss was due to java.lang.ClassNotFoundException java.lang.ClassNotFoundException: scala.None$ error when mysql-async is added to build.sbt

My application is failing with a "Loss was due to
java.lang.ClassNotFoundException java.lang.ClassNotFoundException:
scala.None$" error when the mysql-async library
(https://github.com/mauricio/postgresql-async) is added to build.sbt.

I've added the following line to build.sbt:

"com.github.mauricio" %% "mysql-async" % "0.2.11"

When this line is commented out the application runs just fine.
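
For reference, the relevant part of my build.sbt looks roughly like this
(other settings omitted; the Scala version is the one I compile with):

scalaVersion := "2.10.2"

// Commenting this dependency out makes the application run fine
libraryDependencies += "com.github.mauricio" %% "mysql-async" % "0.2.11"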

Could you please help? I'm a newbie with Scala and Spark, but I would like
to create an async connection to MySQL to import my data definitions (i.e.
which datasets there are, where to find them in HDFS, etc.) in order to
create dynamic RDDs based on definitions in MySQL.
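
To give an idea of what I mean, the kind of lookup I'm after is roughly
this (the dataset_definitions table, its columns, and the connection
settings are just placeholders, and I'm blocking with Await only to keep
the sketch short):

import com.github.mauricio.async.db.Configuration
import com.github.mauricio.async.db.mysql.MySQLConnection
import scala.concurrent.Await
import scala.concurrent.duration._

// Placeholder connection settings
val conf = Configuration(username = "user", host = "localhost", port = 3306,
  password = Some("secret"), database = Some("metadata"))

val connection = new MySQLConnection(conf)
Await.result(connection.connect, 5.seconds)

// Placeholder query: one row per dataset definition, with its HDFS path
val result = Await.result(
  connection.sendQuery("SELECT name, hdfs_path FROM dataset_definitions"),
  5.seconds)

val definitions = result.rows
  .map(_.map(row => (row("name").toString, row("hdfs_path").toString)))
  .getOrElse(IndexedSeq.empty)

// Each (name, hdfsPath) pair would then drive something like
// sc.textFile(hdfsPath) to build the corresponding RDD.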

I'm getting the following error message:

23:43:54.429 [spark-akka.actor.default-dispatcher-3] INFO o.a.s.s.local.LocalTaskSetManager - Loss was due to java.lang.ClassNotFoundException
java.lang.ClassNotFoundException: scala.None$
at java.net.URLClassLoader$1.run(Unknown Source)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.ClassLoader.loadClass(Unknown Source)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Unknown Source)
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:36)
at java.io.ObjectInputStream.readNonProxyDesc(Unknown Source)
at java.io.ObjectInputStream.readClassDesc(Unknown Source)
at java.io.ObjectInputStream.readOrdinaryObject(Unknown Source)
at java.io.ObjectInputStream.readObject0(Unknown Source)
at java.io.ObjectInputStream.defaultReadFields(Unknown Source)
at java.io.ObjectInputStream.readSerialData(Unknown Source)
at java.io.ObjectInputStream.readOrdinaryObject(Unknown Source)
at java.io.ObjectInputStream.readObject0(Unknown Source)
at java.io.ObjectInputStream.readObject(Unknown Source)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:39)
at org.apache.spark.scheduler.ResultTask$.deserializeInfo(ResultTask.scala:61)
at org.apache.spark.scheduler.ResultTask.readExternal(ResultTask.scala:129)
at java.io.ObjectInputStream.readExternalData(Unknown Source)
at java.io.ObjectInputStream.readOrdinaryObject(Unknown Source)
at java.io.ObjectInputStream.readObject0(Unknown Source)
at java.io.ObjectInputStream.readObject(Unknown Source)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:39)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:61)
at org.apache.spark.scheduler.local.LocalScheduler.runTask(LocalScheduler.scala:191)
at org.apache.spark.scheduler.local.LocalActor$$anonfun$launchTask$1$$anon$1.run(LocalScheduler.scala:68)
at java.util.concurrent.Executors$RunnableAdapter.call(Unknown Source)
at java.util.concurrent.FutureTask$Sync.innerRun(Unknown Source)
at java.util.concurrent.FutureTask.run(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(Unknown Source)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)
at java.lang.Thread.run(Unknown Source)
23:43:54.438 [DAGScheduler] DEBUG o.a.spark.scheduler.DAGScheduler - Got event of type org.apache.spark.scheduler.TaskSetFailed
23:43:54.443 [test-akka.actor.default-dispatcher-3] INFO o.a.spark.scheduler.DAGScheduler - Failed to run count at DataSession.scala:26
23:43:54.447 [spark-akka.actor.default-dispatcher-3] INFO o.a.s.scheduler.local.LocalScheduler - Remove TaskSet 0.0 from pool
[ERROR] [01/19/2014 23:43:54.455] [test-akka.actor.default-dispatcher-6] [akka://test/user/testServer/1/771192171] Job failed: Task 0.0:0 failed more than 4 times; aborting job java.lang.ClassNotFoundException: scala.None$
org.apache.spark.SparkException: Job failed: Task 0.0:0 failed more than 4 times; aborting job java.lang.ClassNotFoundException: scala.None$
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:761)
at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:759)
at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:759)
at org.apache.spark.scheduler.DAGScheduler.processEvent(DAGScheduler.scala:380)
at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$run(DAGScheduler.scala:442)
at org.apache.spark.scheduler.DAGScheduler$$anon$1.run(DAGScheduler.scala:150)

Re: Loss was due to java.lang.ClassNotFoundException java.lang.ClassNotFoundException: scala.None$ error when mysql-async is added to build.sbt

Posted by Richard Siebeling <rs...@gmail.com>.
Solved: mysql-async required Scala 2.10.3 and I was compiling with version
2.10.2. The mismatch presumably put a different scala-library on the
classpath, which would explain why Spark's task deserialization could no
longer load scala.None$.
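
For the archives: aligning the Scala version in build.sbt with the one
mysql-async was built against is what fixed it for me, i.e. something like

scalaVersion := "2.10.3"

libraryDependencies += "com.github.mauricio" %% "mysql-async" % "0.2.11"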

