Posted to issues@spark.apache.org by "Sunita Koppar (JIRA)" <ji...@apache.org> on 2015/12/10 18:22:11 UTC

[jira] [Commented] (SPARK-5281) Registering table on RDD is giving MissingRequirementError

    [ https://issues.apache.org/jira/browse/SPARK-5281?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15051286#comment-15051286 ] 

Sunita Koppar commented on SPARK-5281:
--------------------------------------

We tried this workaround as well and the application works perfectly. However, we are not able to use ScalaTest to write unit tests that invoke the application code itself. The test fails with the error below:

Exception in thread "main" java.lang.NoClassDefFoundError: scala/Function1
	at scala.tools.eclipse.scalatest.launching.ScalaTestLauncher.main(ScalaTestLauncher.scala)
Caused by: java.lang.ClassNotFoundException: scala.Function1
	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)

This essentially means the launcher cannot find the Scala library on its classpath.
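
For reference, one way to sidestep the Eclipse ScalaTest launcher is to declare the Scala library explicitly and run the tests from sbt in a forked JVM. The build fragment below is only a sketch; the version numbers and dependency coordinates are assumptions, not taken from the actual project:

{code}
// build.sbt -- minimal sketch (sbt 0.13 style); versions are illustrative
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-sql"     % "1.2.0",
  "org.scala-lang"    % "scala-library" % scalaVersion.value,  // make the Scala library an explicit dependency
  "org.scala-lang"    % "scala-reflect" % scalaVersion.value,  // used by catalyst's ScalaReflection
  "org.scalatest"    %% "scalatest"     % "2.2.4" % "test"
)

// Fork the test JVM so tests run with sbt's classpath
// instead of the IDE launcher's (which is missing scala-library here).
fork in Test := true
{code}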

> Registering table on RDD is giving MissingRequirementError
> ----------------------------------------------------------
>
>                 Key: SPARK-5281
>                 URL: https://issues.apache.org/jira/browse/SPARK-5281
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.2.0, 1.3.1
>            Reporter: sarsol
>            Assignee: Iulian Dragos
>            Priority: Critical
>             Fix For: 1.4.0
>
>
> The application crashes on the line  {{rdd.registerTempTable("temp")}}  in version 1.2 when using sbt or the Eclipse Scala IDE.
> Stacktrace:
> {code}
> Exception in thread "main" scala.reflect.internal.MissingRequirementError: class org.apache.spark.sql.catalyst.ScalaReflection in JavaMirror with primordial classloader with boot classpath [C:\sar\scala\scala-ide\eclipse\plugins\org.scala-ide.scala210.jars_4.0.0.201407240952\target\jars\scala-library.jar;C:\sar\scala\scala-ide\eclipse\plugins\org.scala-ide.scala210.jars_4.0.0.201407240952\target\jars\scala-reflect.jar;C:\sar\scala\scala-ide\eclipse\plugins\org.scala-ide.scala210.jars_4.0.0.201407240952\target\jars\scala-actor.jar;C:\sar\scala\scala-ide\eclipse\plugins\org.scala-ide.scala210.jars_4.0.0.201407240952\target\jars\scala-swing.jar;C:\sar\scala\scala-ide\eclipse\plugins\org.scala-ide.scala210.jars_4.0.0.201407240952\target\jars\scala-compiler.jar;C:\Program Files\Java\jre7\lib\resources.jar;C:\Program Files\Java\jre7\lib\rt.jar;C:\Program Files\Java\jre7\lib\sunrsasign.jar;C:\Program Files\Java\jre7\lib\jsse.jar;C:\Program Files\Java\jre7\lib\jce.jar;C:\Program Files\Java\jre7\lib\charsets.jar;C:\Program Files\Java\jre7\lib\jfr.jar;C:\Program Files\Java\jre7\classes] not found.
> 	at scala.reflect.internal.MissingRequirementError$.signal(MissingRequirementError.scala:16)
> 	at scala.reflect.internal.MissingRequirementError$.notFound(MissingRequirementError.scala:17)
> 	at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:48)
> 	at scala.reflect.internal.Mirrors$RootsBase.getModuleOrClass(Mirrors.scala:61)
> 	at scala.reflect.internal.Mirrors$RootsBase.staticModuleOrClass(Mirrors.scala:72)
> 	at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:119)
> 	at scala.reflect.internal.Mirrors$RootsBase.staticClass(Mirrors.scala:21)
> 	at org.apache.spark.sql.catalyst.ScalaReflection$$typecreator1$1.apply(ScalaReflection.scala:115)
> 	at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe$lzycompute(TypeTags.scala:231)
> 	at scala.reflect.api.TypeTags$WeakTypeTagImpl.tpe(TypeTags.scala:231)
> 	at scala.reflect.api.TypeTags$class.typeOf(TypeTags.scala:335)
> 	at scala.reflect.api.Universe.typeOf(Universe.scala:59)
> 	at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:115)
> 	at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:33)
> 	at org.apache.spark.sql.catalyst.ScalaReflection$class.schemaFor(ScalaReflection.scala:100)
> 	at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:33)
> 	at org.apache.spark.sql.catalyst.ScalaReflection$class.attributesFor(ScalaReflection.scala:94)
> 	at org.apache.spark.sql.catalyst.ScalaReflection$.attributesFor(ScalaReflection.scala:33)
> 	at org.apache.spark.sql.SQLContext.createSchemaRDD(SQLContext.scala:111)
> 	at com.sar.spark.dq.poc.SparkPOC$delayedInit$body.apply(SparkPOC.scala:43)
> 	at scala.Function0$class.apply$mcV$sp(Function0.scala:40)
> 	at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
> 	at scala.App$$anonfun$main$1.apply(App.scala:71)
> 	at scala.App$$anonfun$main$1.apply(App.scala:71)
> 	at scala.collection.immutable.List.foreach(List.scala:318)
> 	at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:32)
> 	at scala.App$class.main(App.scala:71)
> {code}
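
For context, a minimal Spark 1.2-style program that exercises the code path in the quoted stack trace (SQLContext.createSchemaRDD followed by registerTempTable) would look roughly like the sketch below; the case class, table name, and master URL are illustrative only:

{code}
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

// The case class must be defined at the top level so ScalaReflection can derive a schema.
case class Record(id: Int, name: String)

object SparkPOC extends App {
  val sc = new SparkContext(new SparkConf().setAppName("poc").setMaster("local[*]"))
  val sqlContext = new SQLContext(sc)

  // Spark 1.2: implicit conversion from RDD of case classes to SchemaRDD
  import sqlContext.createSchemaRDD

  val rdd = sc.parallelize(Seq(Record(1, "a"), Record(2, "b")))
  rdd.registerTempTable("temp")   // reported to fail here with MissingRequirementError
  sqlContext.sql("SELECT * FROM temp").collect().foreach(println)

  sc.stop()
}
{code}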


