Posted to user@livy.apache.org by David Espinosa <es...@gmail.com> on 2018/02/22 22:30:32 UTC

Predef does not have a member classOf

Hi all,

I'm a new Livy user. I've created a Scala app that runs well on Spark 2.2.1.
The Scala version used is 2.11.12.
I have tried to create a session in Livy 0.5.0 using this jar, and once the
Spark session is created an error is raised and the session goes dead.

This is part of the error stack:

"18/02/22 17:00:23 INFO SparkEntries: Spark context finished initialization
in 2282ms",
        "18/02/22 17:00:24 INFO SparkEntries: Created Spark session.",
        "Exception in thread \"main\" scala.reflect.internal.FatalError:
object Predef does not have a member classOf",
        "\tat scala.reflect.internal.Definitions$DefinitionsClass.
scala$reflect$internal$Definitions$DefinitionsClass$$fatalMissingSymbol(
Definitions.scala:1186)",
        "\tat scala.reflect.internal.Definitions$DefinitionsClass.
getMember(Definitions.scala:1203)",
        "\tat scala.reflect.internal.Definitions$DefinitionsClass.
getMemberMethod(Definitions.scala:1238)",

Has anybody run into and fixed a problem like this?

Thanks in advance,
David

Re: Predef does not have a member classOf

Posted by David Espinosa <es...@gmail.com>.
Hi Saisai!
Thanks for your reply.
Here is how we use Livy:

   - We have a service written in Scala and running on Spark which provides
   semantic search features. It is called Recommender.
   - We need to expose these results externally.
   - Our plan is to create a REST API (Java) that asks Livy to send the job
   to Spark and to gather the results (a sketch of that Livy interaction
   follows this list), so:
      - REST HTTP (user) --> CUSTOM REST API (Java) --> Livy --> Spark
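
To make the Livy leg of this flow concrete, here is a minimal sketch of how
such an API layer could submit a statement and read back results over Livy's
REST API (Scala, JDK-only HTTP; the host/port, the session id and the
com.example.Recommender call are placeholders, not our actual code):

    import java.net.{HttpURLConnection, URL}
    import java.nio.charset.StandardCharsets
    import scala.io.Source

    object LivyStatementSketch {
      // Placeholder Livy endpoint; replace with the real host and port.
      private val livyUrl = "http://livy-host:8998"

      private def post(path: String, json: String): String = {
        val conn = new URL(livyUrl + path).openConnection().asInstanceOf[HttpURLConnection]
        conn.setRequestMethod("POST")
        conn.setRequestProperty("Content-Type", "application/json")
        conn.setDoOutput(true)
        conn.getOutputStream.write(json.getBytes(StandardCharsets.UTF_8))
        Source.fromInputStream(conn.getInputStream).mkString
      }

      private def get(path: String): String =
        Source.fromInputStream(new URL(livyUrl + path).openConnection().getInputStream).mkString

      def main(args: Array[String]): Unit = {
        val sessionId = 0  // id returned earlier by POST /sessions
        // Submit a Scala snippet that calls into our fat jar (class name is hypothetical).
        val submitted = post(s"/sessions/$sessionId/statements",
          """{"code": "com.example.Recommender.search(\"some query\")"}""")
        println(submitted)
        // Results are gathered later by polling the statements endpoint.
        println(get(s"/sessions/$sessionId/statements"))
      }
    }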

Livy Configurations:

   - We are using Livy 0.5 with Spark 2.2.1.
   - We are using standalone Spark, so in the config we
   have: livy.spark.master = spark://XX.XXX.XXX.XX:7077
   - Usage: POST /sessions with body {"kind": "spark"} (a fuller request
   sketch follows this list).
   - The fat jar is included in rsc-jars/, size 150 MB. The Scala jar is
   compiled with 2.11.12 (although we have also tried versions down to
   2.11.0 with the same results). Here are some of the dependency families
   of this fat jar:
      - com.fasterxml.jackson.core (2.6.5) | org.elasticsearch (6.0.1)
      | org.apache.spark (2.2.1) | JohnSnowLabs-spark-nlp (1.2.3)
      | org.apache.tika-tika-parsers (1.17) | net.ruippeixotog (2.0.0)
      | com.typesafe.akka (2.5.9) | com.github.romix.akka (0.5.0)
      | com.github.fommil.netlib (1.1.2) | com.vitorsvieira (0.1.2)
      | com.carrotsearch (1.0.0) | org.apache.jena (3.6.0)
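
For context, this is a sketch of what a fuller create-session request could
look like (the jar URI and the driver memory below are placeholders, not our
real values):

    POST /sessions
    {
      "kind": "spark",
      "jars": ["hdfs:///path/to/recommender-assembly.jar"],
      "conf": {"spark.driver.memory": "4g"}
    }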

The jar works fine with spark-submit and even when used from Zeppelin.
As it is a big file, I tried using Livy's addJar (in the create-session
request) so that it picks up the jar located on the master, but as I'm not
running in YARN mode, I think my approach is wrong.
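
To show the kind of jar handling I have been looking at, here is a rough
sketch using Livy's programmatic client (not our actual code; the endpoint
URL and the jar paths are placeholders):

    import java.io.File
    import java.net.URI
    import org.apache.livy.{LivyClient, LivyClientBuilder}

    object AddJarSketch {
      def main(args: Array[String]): Unit = {
        val client: LivyClient = new LivyClientBuilder()
          .setURI(new URI("http://livy-host:8998"))  // placeholder Livy endpoint
          .build()
        try {
          // Upload the fat jar from the machine running this code to the session...
          client.uploadJar(new File("/path/on/master/recommender-assembly.jar")).get()
          // ...or reference a jar at a URI the cluster can already reach.
          client.addJar(new URI("hdfs:///path/to/recommender-assembly.jar")).get()
        } finally {
          client.stop(true)
        }
      }
    }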

Complete error stack:

INFO SparkEntries: Spark context finished initialization in 1642ms
18/02/22 16:02:52 INFO SparkEntries: Created Spark session.
Exception in thread "main" scala.reflect.internal.FatalError: object Predef does not have a member classOf
        at scala.reflect.internal.Definitions$DefinitionsClass.scala$reflect$internal$Definitions$DefinitionsClass$$fatalMissingSymbol(Definitions.scala:1186)
        at scala.reflect.internal.Definitions$DefinitionsClass.getMember(Definitions.scala:1203)
        at scala.reflect.internal.Definitions$DefinitionsClass.getMemberMethod(Definitions.scala:1238)
        at scala.reflect.internal.Definitions$DefinitionsClass$RunDefinitions.Predef_classOf$lzycompute(Definitions.scala:1469)
        at scala.reflect.internal.Definitions$DefinitionsClass$RunDefinitions.Predef_classOf(Definitions.scala:1469)
        at scala.reflect.internal.Definitions$DefinitionsClass$RunDefinitions.isPredefClassOf(Definitions.scala:1459)
        at scala.tools.nsc.typechecker.Typers$Typer.typedIdent$2(Typers.scala:4885)
        at scala.tools.nsc.typechecker.Typers$Typer.typedIdentOrWildcard$1(Typers.scala:4908)
        at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5340)
        at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5360)
        at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5396)
        at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5423)
        at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5370)
        at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5374)
        at scala.tools.nsc.interpreter.ReplGlobal$$anon$1$$anon$2.typed(ReplGlobal.scala:36)
        at scala.tools.nsc.typechecker.Typers$Typer.typedQualifier(Typers.scala:5472)
        at scala.tools.nsc.typechecker.Typers$Typer.typedQualifier(Typers.scala:5480)
        at scala.tools.nsc.typechecker.Typers$Typer.typedPackageDef$1(Typers.scala:5012)
        at scala.tools.nsc.typechecker.Typers$Typer.typedMemberDef$1(Typers.scala:5312)
        at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5359)
        at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5396)
        at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5423)
        at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5370)
        at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5374)
        at scala.tools.nsc.interpreter.ReplGlobal$$anon$1$$anon$2.typed(ReplGlobal.scala:36)
        at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5448)
        at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3.apply(Analyzer.scala:102)
        at scala.tools.nsc.Global$GlobalPhase$$anonfun$applyPhase$1.apply$mcV$sp(Global.scala:440)
        at scala.tools.nsc.Global$GlobalPhase.withCurrentUnit(Global.scala:431)
        at scala.tools.nsc.Global$GlobalPhase.applyPhase(Global.scala:440)
        at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3$$anonfun$run$1.apply(Analyzer.scala:94)
        at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3$$anonfun$run$1.apply(Analyzer.scala:93)
        at scala.collection.Iterator$class.foreach(Iterator.scala:893)
        at scala.collection.AbstractIterator.foreach(Iterator.scala:1336)
        at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3.run(Analyzer.scala:93)
        at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1501)
        at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1486)
        at scala.tools.nsc.Global$Run.compileSources(Global.scala:1481)
        at scala.tools.nsc.interpreter.IMain.compileSourcesKeepingRun(IMain.scala:435)
        at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compileAndSaveRun(IMain.scala:855)
        at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compile(IMain.scala:813)
        at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:675)
        at org.apache.livy.repl.SparkInterpreter$$anonfun$bind$1.apply(SparkInterpreter.scala:146)
        at org.apache.livy.repl.SparkInterpreter$$anonfun$bind$1.apply(SparkInterpreter.scala:146)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
        at org.apache.livy.repl.SparkInterpreter.bind(SparkInterpreter.scala:145)
        at org.apache.livy.repl.AbstractSparkInterpreter.postStart(AbstractSparkInterpreter.scala:72)
        at org.apache.livy.repl.SparkInterpreter$$anonfun$start$1.apply$mcV$sp(SparkInterpreter.scala:95)
        at org.apache.livy.repl.SparkInterpreter$$anonfun$start$1.apply(SparkInterpreter.scala:70)
        at org.apache.livy.repl.SparkInterpreter$$anonfun$start$1.apply(SparkInterpreter.scala:70)
        at org.apache.livy.repl.AbstractSparkInterpreter.restoreContextClassLoader(AbstractSparkInterpreter.scala:340)
        at org.apache.livy.repl.SparkInterpreter.start(SparkInterpreter.scala:70)
        at org.apache.livy.repl.Session$$anonfun$1.apply(Session.scala:128)
        at org.apache.livy.repl.Session$$anonfun$1.apply(Session.scala:122)
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
        at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
18/02/22 16:02:52 WARN SingleThreadEventExecutor: An event executor terminated with non-empty task queue (1)


Any help would be greatly appreciated!
David


Re: Predef does not have a member classOf

Posted by Saisai Shao <sa...@gmail.com>.
This seems like a Scala version issue (as far as I can tell from the stack).
Could you please describe in detail how you are using Livy and which
configurations you have set?

Thanks
Jerry
