Posted to issues@livy.apache.org by "locona (Jira)" <ji...@apache.org> on 2020/06/19 20:12:00 UTC
[jira] [Comment Edited] (LIVY-636) Unable to create interactive session with additional JAR in spark.driver.extraClassPath
[ https://issues.apache.org/jira/browse/LIVY-636?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17140797#comment-17140797 ]
locona edited comment on LIVY-636 at 6/19/20, 8:11 PM:
-------------------------------------------------------
Hi [~glennthomas] cc [~ishitavirmani]
Is there any progress on this?
The issue occurs with the following configuration:
{code:java}
// livy.conf
livy.spark.master=yarn
livy.spark.deploy-mode=client
{code}
Could this be related to the following PR?
[https://github.com/cloudera/livy/pull/312/files]
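For reference, the failure can be reproduced by creating an interactive session with spark.driver.extraClassPath set, as in the original report. A minimal sketch using only the standard library; the server address and JAR path are placeholders, not values from this ticket:

```python
import json
from urllib import request

def build_session_payload(jar_path):
    """Build the POST /sessions body that triggers the reported failure.

    The JAR path is a placeholder; substitute a JAR present on the driver host.
    """
    return {
        "kind": "pyspark",
        "conf": {"spark.driver.extraClassPath": jar_path},
    }

if __name__ == "__main__":
    payload = build_session_payload("/data/example-0.0.1-SNAPSHOT.jar")
    req = request.Request(
        "http://localhost:8998/sessions",           # assumed Livy endpoint
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # With the bug present, the session is created but the driver dies
    # during startup with the NullPointerException shown below.
    with request.urlopen(req) as resp:
        print(resp.status, resp.read().decode())
```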
{code:java}
// error message
20/06/19 19:19:10 WARN org.apache.spark.deploy.yarn.Client: Same name resource file:///usr/lib/spark/python/lib/pyspark.zip added multiple times to distributed cache
20/06/19 19:19:10 WARN org.apache.spark.deploy.yarn.Client: Same name resource file:///usr/lib/spark/python/lib/py4j-0.10.7-src.zip added multiple times to distributed cache
20/06/19 19:19:12 INFO org.apache.hadoop.yarn.client.api.impl.YarnClientImpl: Submitted application application_1592570348792_0021
20/06/19 19:19:19 INFO org.apache.livy.rsc.driver.SparkEntries: Spark context finished initialization in 14871ms
20/06/19 19:19:19 INFO org.apache.livy.rsc.driver.SparkEntries: Created Spark session.
20/06/19 19:19:20 WARN org.apache.livy.rsc.driver.RSCDriver: Error during cancel job.
java.lang.NullPointerException
at org.apache.livy.rsc.driver.JobWrapper.cancel(JobWrapper.java:90)
at org.apache.livy.rsc.driver.RSCDriver.shutdown(RSCDriver.java:128)
at org.apache.livy.rsc.driver.RSCDriver.run(RSCDriver.java:360)
at org.apache.livy.rsc.driver.RSCDriverBootstrapper.main(RSCDriverBootstrapper.java:93)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Exception in thread "main" scala.reflect.internal.FatalError: object Predef does not have a member classOf
at scala.reflect.internal.Definitions$DefinitionsClass.scala$reflect$internal$Definitions$DefinitionsClass$$fatalMissingSymbol(Definitions.scala:1182)
at scala.reflect.internal.Definitions$DefinitionsClass.getMember(Definitions.scala:1199)
at scala.reflect.internal.Definitions$DefinitionsClass.getMemberMethod(Definitions.scala:1234)
at scala.reflect.internal.Definitions$DefinitionsClass$RunDefinitions.Predef_classOf$lzycompute(Definitions.scala:1465)
at scala.reflect.internal.Definitions$DefinitionsClass$RunDefinitions.Predef_classOf(Definitions.scala:1465)
at scala.reflect.internal.Definitions$DefinitionsClass$RunDefinitions.isPredefClassOf(Definitions.scala:1455)
at scala.tools.nsc.typechecker.Typers$Typer.typedIdent$2(Typers.scala:4912)
at scala.tools.nsc.typechecker.Typers$Typer.typedIdentOrWildcard$1(Typers.scala:4935)
at scala.tools.nsc.typechecker.Typers$Typer.typedInAnyMode$1(Typers.scala:5367)
at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5387)
at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5423)
at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5450)
at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5397)
at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5401)
at scala.tools.nsc.interpreter.ReplGlobal$$anon$1$$anon$2.typed(ReplGlobal.scala:36)
at scala.tools.nsc.typechecker.Typers$Typer.typedQualifier(Typers.scala:5501)
at scala.tools.nsc.typechecker.Typers$Typer.typedQualifier(Typers.scala:5509)
at scala.tools.nsc.typechecker.Typers$Typer.typedPackageDef$1(Typers.scala:5039)
at scala.tools.nsc.typechecker.Typers$Typer.typedMemberDef$1(Typers.scala:5339)
at scala.tools.nsc.typechecker.Typers$Typer.typed1(Typers.scala:5386)
at scala.tools.nsc.typechecker.Typers$Typer.runTyper$1(Typers.scala:5423)
at scala.tools.nsc.typechecker.Typers$Typer.scala$tools$nsc$typechecker$Typers$Typer$$typedInternal(Typers.scala:5450)
at scala.tools.nsc.typechecker.Typers$Typer.body$2(Typers.scala:5397)
at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5401)
at scala.tools.nsc.interpreter.ReplGlobal$$anon$1$$anon$2.typed(ReplGlobal.scala:36)
at scala.tools.nsc.typechecker.Typers$Typer.typed(Typers.scala:5477)
at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3.apply(Analyzer.scala:102)
at scala.tools.nsc.Global$GlobalPhase$$anonfun$applyPhase$1.apply$mcV$sp(Global.scala:467)
at scala.tools.nsc.Global$GlobalPhase.withCurrentUnit(Global.scala:458)
at scala.tools.nsc.Global$GlobalPhase.applyPhase(Global.scala:467)
at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3$$anonfun$run$1.apply(Analyzer.scala:94)
at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3$$anonfun$run$1.apply(Analyzer.scala:93)
at scala.collection.Iterator$class.foreach(Iterator.scala:891)
at scala.collection.AbstractIterator.foreach(Iterator.scala:1334)
at scala.tools.nsc.typechecker.Analyzer$typerFactory$$anon$3.run(Analyzer.scala:93)
at scala.tools.nsc.Global$Run.compileUnitsInternal(Global.scala:1528)
at scala.tools.nsc.Global$Run.compileUnits(Global.scala:1513)
at scala.tools.nsc.Global$Run.compileSources(Global.scala:1508)
at scala.tools.nsc.interpreter.IMain.compileSourcesKeepingRun(IMain.scala:442)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compileAndSaveRun(IMain.scala:862)
at scala.tools.nsc.interpreter.IMain$ReadEvalPrint.compile(IMain.scala:820)
at scala.tools.nsc.interpreter.IMain.bind(IMain.scala:682)
at org.apache.livy.repl.SparkInterpreter$$anonfun$bind$1.apply(SparkInterpreter.scala:132)
at org.apache.livy.repl.SparkInterpreter$$anonfun$bind$1.apply(SparkInterpreter.scala:132)
at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:221)
at org.apache.livy.repl.SparkInterpreter.bind(SparkInterpreter.scala:131)
at org.apache.livy.repl.AbstractSparkInterpreter.postStart(AbstractSparkInterpreter.scala:72)
at org.apache.livy.repl.SparkInterpreter$$anonfun$start$1.apply$mcV$sp(SparkInterpreter.scala:88)
at org.apache.livy.repl.SparkInterpreter$$anonfun$start$1.apply(SparkInterpreter.scala:63)
at org.apache.livy.repl.SparkInterpreter$$anonfun$start$1.apply(SparkInterpreter.scala:63)
at org.apache.livy.repl.AbstractSparkInterpreter.restoreContextClassLoader(AbstractSparkInterpreter.scala:340)
at org.apache.livy.repl.SparkInterpreter.start(SparkInterpreter.scala:63)
at org.apache.livy.repl.Session$$anonfun$1.apply(Session.scala:128)
at org.apache.livy.repl.Session$$anonfun$1.apply(Session.scala:122)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
{code}
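When the driver dies like this, the POST itself often succeeds and the session later transitions to the dead state, so the state has to be polled to observe the failure. A sketch, again with a placeholder endpoint and session id; the terminal-state set follows the Livy REST API docs, but verify it against your version:

```python
import json
from urllib import request

# Terminal session states per the Livy REST API documentation.
TERMINAL_STATES = {"dead", "error", "killed", "success"}

def is_terminal(state):
    """True once a session can no longer transition to idle."""
    return state in TERMINAL_STATES

def fetch_state(base_url, session_id):
    """GET /sessions/{id} and return the reported session state."""
    with request.urlopen(f"{base_url}/sessions/{session_id}") as resp:
        return json.loads(resp.read())["state"]

if __name__ == "__main__":
    import time
    base = "http://localhost:8998"    # assumed Livy endpoint
    state = fetch_state(base, 0)      # session id 0 is a placeholder
    while not is_terminal(state) and state != "idle":
        time.sleep(2)
        state = fetch_state(base, 0)
    # Ends up "dead" when the driver hits the NPE above.
    print("final state:", state)
```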
> Unable to create interactive session with additional JAR in spark.driver.extraClassPath
> ---------------------------------------------------------------------------------------
>
> Key: LIVY-636
> URL: https://issues.apache.org/jira/browse/LIVY-636
> Project: Livy
> Issue Type: Bug
> Affects Versions: 0.6.0
> Reporter: Ishita Virmani
> Priority: Major
> Attachments: applicationmaster.log, container.log, stacktrace.txt, test.png
>
>
> Command Run: {{curl -H "Content-Type: application/json" -X POST -d '{"kind":"pyspark","conf":{"spark.driver.extraClassPath":"/data/XXX-0.0.1-SNAPSHOT.jar"}}' -i http://<LIVY_SERVER_IP:PORT>/sessions}}
> The above command fails to create a Spark session on YARN with a NullPointerException; the stack trace for it is attached.
> The JAR file here is present on the local driver path. An HDFS path of the form {{hdfs://<NM_IP>:<NM_Port>/data/XXX-0.0.1-SNAPSHOT.jar}} was also tried.
--
This message was sent by Atlassian Jira
(v8.3.4#803005)