Posted to issues@spark.apache.org by "Bhaskar Jyoti Ghosh (JIRA)" <ji...@apache.org> on 2016/10/04 05:52:20 UTC
[jira] [Commented] (SPARK-8494) ClassNotFoundException when running with sbt, scala 2.10.4, spray 1.3.3
[ https://issues.apache.org/jira/browse/SPARK-8494?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15544436#comment-15544436 ]
Bhaskar Jyoti Ghosh commented on SPARK-8494:
--------------------------------------------
Hello Experts,
I am facing a similar error while trying to run my Spark ML code. In my case, the error is thrown while loading a saved CrossValidatorModel from the filesystem.
Background: I am trying to expose Scala code as a REST service through Scalatra. The flow is to load the model and use it to predict the class for a user input.
ERROR org.apache.spark.util.Utils - Exception encountered
java.lang.ClassNotFoundException: scala.Some
at java.net.URLClassLoader$1.run(URLClassLoader.java:372) ~[na:1.8.0_25]
at java.net.URLClassLoader$1.run(URLClassLoader.java:361) ~[na:1.8.0_25]
at java.security.AccessController.doPrivileged(Native Method) ~[na:1.8.0_25]
at java.net.URLClassLoader.findClass(URLClassLoader.java:360) ~[na:1.8.0_25]
at java.lang.ClassLoader.loadClass(ClassLoader.java:424) ~[na:1.8.0_25]
at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ~[na:1.8.0_25]
at java.lang.Class.forName0(Native Method) ~[na:1.8.0_25]
at java.lang.Class.forName(Class.java:344) ~[na:1.8.0_25]
at org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:67) ~[spark-core_2.11-2.0.0.jar:2.0.0]
...
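The shape of the failure above (resolveClass throwing ClassNotFoundException for a standard class like scala.Some) is what you get when deserialization runs through a classloader that cannot see the application classpath. A minimal sketch, plain Scala with no Spark involved, just to illustrate the mechanism:

```scala
import java.net.URLClassLoader

object ClassLoaderDemo {
  // Returns true if `name` can be resolved through `loader`.
  def loadableWith(loader: ClassLoader, name: String): Boolean =
    try { Class.forName(name, false, loader); true }
    catch { case _: ClassNotFoundException => false }

  def main(args: Array[String]): Unit = {
    // The application classloader sees scala-library, so this succeeds:
    println(loadableWith(getClass.getClassLoader, "scala.Some"))

    // A loader whose parent is the bootstrap loader does not see the
    // application classpath, so the same lookup fails -- the same failure
    // mode as the resolveClass frame in the trace above:
    val isolated = new URLClassLoader(Array.empty, null)
    println(loadableWith(isolated, "scala.Some"))
  }
}
```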
I am using the spark-2.0.0-bin-hadoop2.7 distribution. The dependencies listed in my sbt build are:
val scalatraVersion = "2.3.0"
val sparkVersion = "2.0.0"
val sprayVersion = "1.3.4"
libraryDependencies ++= Seq(
  "org.scalatra" %% "scalatra" % scalatraVersion,
  "org.scalatra" %% "scalatra-json" % scalatraVersion,
  "org.scalatra" %% "scalatra-commands" % scalatraVersion,
  "org.json4s" %% "json4s-jackson" % "3.2.9",
  "ch.qos.logback" % "logback-classic" % "1.1.3",
  "org.eclipse.jetty" % "jetty-webapp" % "9.2.10.v20150310",
  "com.typesafe.akka" %% "akka-actor" % "2.3.4",
  "org.apache.spark" %% "spark-core" % sparkVersion exclude("org.slf4j", "slf4j-log4j12"),
  "org.apache.spark" %% "spark-sql" % sparkVersion,
  "org.apache.spark" %% "spark-mllib" % sparkVersion,
  "io.spray" %% "spray-can" % sprayVersion,
  "io.spray" %% "spray-routing" % sprayVersion
)
Is this an issue with Spark 2.0.0, or am I missing some dependency?
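If this turns out to be the sbt classloader issue from SPARK-1923 (sbt's `run` sharing the sbt JVM's classloader), one workaround I have seen suggested is forking a separate JVM so the deserializer resolves classes against the application classpath:

{code}
// build.sbt -- run the application in a forked JVM
fork := true
{code}

This is only a guess at the cause here; it may not apply if the model is loaded outside an sbt-launched JVM.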
Regards,
Bhaskar
> ClassNotFoundException when running with sbt, scala 2.10.4, spray 1.3.3
> -----------------------------------------------------------------------
>
> Key: SPARK-8494
> URL: https://issues.apache.org/jira/browse/SPARK-8494
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Reporter: PJ Fanning
> Attachments: spark-test-case.zip
>
>
> I found a similar issue to SPARK-1923 but with Scala 2.10.4.
> I used the Test.scala from SPARK-1923 but used the libraryDependencies from a build.sbt that I am working on.
> If I remove the spray 1.3.3 jars, the test case passes; otherwise it fails with a ClassNotFoundException.
> I have a spark-assembly jar built using Spark 1.3.2-SNAPSHOT.
> Application:
> {code}
> import org.apache.spark.SparkConf
> import org.apache.spark.SparkContext
> object Test {
>   def main(args: Array[String]): Unit = {
>     val conf = new SparkConf().setMaster("local[4]").setAppName("Test")
>     val sc = new SparkContext(conf)
>     sc.makeRDD(1 to 1000, 10).map(x => Some(x)).count
>     sc.stop()
>   }
> }
> {code}
> Exception:
> {code}
> org.apache.spark.SparkException: Job aborted due to stage failure: Task 0.0:1 failed 1 times, most recent failure: Exception failure in TID 1 on host localhost: java.lang.ClassNotFoundException: scala.collection.immutable.Range
> java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> java.security.AccessController.doPrivileged(Native Method)
> java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> java.lang.Class.forName0(Native Method)
> java.lang.Class.forName(Class.java:270)
> org.apache.spark.serializer.JavaDeserializationStream$$anon$1.resolveClass(JavaSerializer.scala:60)
> java.io.ObjectInputStream.readNonProxyDesc(ObjectInputStream.java:1612)
> java.io.ObjectInputStream.readClassDesc(ObjectInputStream.java:1517)
> {code}
> {code}
> name := "spark-test-case"
> version := "1.0"
> scalaVersion := "2.10.4"
> resolvers += "spray repo" at "http://repo.spray.io"
> resolvers += "Scalaz Bintray Repo" at "https://dl.bintray.com/scalaz/releases"
> val akkaVersion = "2.3.11"
> val sprayVersion = "1.3.3"
> libraryDependencies ++= Seq(
>   "com.h2database" % "h2" % "1.4.187",
>   "com.typesafe.akka" %% "akka-actor" % akkaVersion,
>   "com.typesafe.akka" %% "akka-slf4j" % akkaVersion,
>   "ch.qos.logback" % "logback-classic" % "1.0.13",
>   "io.spray" %% "spray-can" % sprayVersion,
>   "io.spray" %% "spray-routing" % sprayVersion,
>   "io.spray" %% "spray-json" % "1.3.1",
>   "com.databricks" %% "spark-csv" % "1.0.3",
>   "org.specs2" %% "specs2" % "2.4.17" % "test",
>   "org.specs2" %% "specs2-junit" % "2.4.17" % "test",
>   "io.spray" %% "spray-testkit" % sprayVersion % "test",
>   "com.typesafe.akka" %% "akka-testkit" % akkaVersion % "test",
>   "junit" % "junit" % "4.12" % "test"
> )
> scalacOptions ++= Seq(
>   "-unchecked",
>   "-deprecation",
>   "-Xlint",
>   "-Ywarn-dead-code",
>   "-language:_",
>   "-target:jvm-1.7",
>   "-encoding", "UTF-8"
> )
> testOptions += Tests.Argument(TestFrameworks.JUnit, "-v")
> {code}
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)