Posted to issues@livy.apache.org by "xianhua.chen (JIRA)" <ji...@apache.org> on 2018/07/19 03:17:00 UTC

[jira] [Created] (LIVY-486) livy 0.5 on cdh 5.14.0 throw java.io.FileNotFoundException

xianhua.chen created LIVY-486:
---------------------------------

             Summary: livy 0.5 on cdh 5.14.0 throw java.io.FileNotFoundException
                 Key: LIVY-486
                 URL: https://issues.apache.org/jira/browse/LIVY-486
             Project: Livy
          Issue Type: Question
            Reporter: xianhua.chen


Hi,

My cluster is CDH 5.14.0 with Spark 2.2.

Livy 0.5.0 runs with the following configuration:

livy.conf:

livy.spark.master = yarn

livy.spark.deploy-mode = client

livy-env.sh:

export SPARK_HOME=/opt/cloudera/parcels/SPARK2/lib/spark2
export HADOOP_CONF_DIR=/etc/hadoop
export LIVY_LOG_DIR=/var/log/livy

create session with curl command:

curl -X POST -d '{"kind": "spark"}' -H "Content-Type: application/json" localhost:8998/sessions

then the log is as follows:
{code}
18/07/19 10:11:16 INFO yarn.Client:
	 client token: N/A
	 diagnostics: N/A
	 ApplicationMaster host: N/A
	 ApplicationMaster RPC port: -1
	 queue: root.users.hdfs
	 start time: 1531966275080
	 final status: UNDEFINED
	 tracking URL: http://scm.iwellmass.com:8088/proxy/application_1531876562801_0010/
	 user: hdfs
18/07/19 10:11:17 INFO yarn.Client: Application report for application_1531876562801_0010 (state: FAILED)
18/07/19 10:11:17 INFO yarn.Client:
	 client token: N/A
	 diagnostics: Application application_1531876562801_0010 failed 1 times due to AM Container for appattempt_1531876562801_0010_000001 exited with exitCode: -1000
For more detailed output, check application tracking page: http://scm.iwellmass.com:8088/proxy/application_1531876562801_0010/
Then, click on links to logs of each attempt.
Diagnostics: java.io.FileNotFoundException: File file:/opt/livy-0.5.0-incubating-bin/repl_2.11-jars/commons-codec-1.9.jar does not exist
Failing this attempt. Failing the application.
	 ApplicationMaster host: N/A
	 ApplicationMaster RPC port: -1
	 queue: root.users.hdfs
	 start time: 1531966275080
	 final status: FAILED
	 tracking URL: http://scm.iwellmass.com:8088/cluster/app/application_1531876562801_0010
	 user: hdfs
18/07/19 10:11:17 INFO yarn.Client: Deleted staging directory file:/var/lib/hadoop-hdfs/.sparkStaging/application_1531876562801_0010
18/07/19 10:11:17 ERROR spark.SparkContext: Error initializing SparkContext.
org.apache.spark.SparkException: Yarn application has already ended! It might have been killed or unable to launch application master.
	at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.waitForApplication(YarnClientSchedulerBackend.scala:85)
	at org.apache.spark.scheduler.cluster.YarnClientSchedulerBackend.start(YarnClientSchedulerBackend.scala:62)
	at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:165)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:512)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2511)
	at org.apache.spark.SparkContext.getOrCreate(SparkContext.scala)
	at org.apache.livy.rsc.driver.SparkEntries.sc(SparkEntries.java:51)
	at org.apache.livy.rsc.driver.SparkEntries.sparkSession(SparkEntries.java:72)
	at org.apache.livy.repl.AbstractSparkInterpreter.postStart(AbstractSparkInterpreter.scala:69)
	at org.apache.livy.repl.SparkInterpreter$$anonfun$start$1.apply$mcV$sp(SparkInterpreter.scala:95)
	at org.apache.livy.repl.SparkInterpreter$$anonfun$start$1.apply(SparkInterpreter.scala:70)
	at org.apache.livy.repl.SparkInterpreter$$anonfun$start$1.apply(SparkInterpreter.scala:70)
	at org.apache.livy.repl.AbstractSparkInterpreter.restoreContextClassLoader(AbstractSparkInterpreter.scala:340)
	at org.apache.livy.repl.SparkInterpreter.start(SparkInterpreter.scala:70)
	at org.apache.livy.repl.Session$$anonfun$1.apply(Session.scala:128)
	at org.apache.livy.repl.Session$$anonfun$1.apply(Session.scala:122)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
	at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
18/07/19 10:11:17 INFO server.AbstractConnector: Stopped Spark@2fa7233{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
18/07/19 10:11:17 INFO ui.SparkUI: Stopped Spark web UI at http://192.168.10.234:4040
{code}
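The key line in the diagnostics is the `java.io.FileNotFoundException` for `file:/opt/livy-0.5.0-incubating-bin/repl_2.11-jars/commons-codec-1.9.jar`: the YARN container tries to localize Livy's REPL jars from a `file:` path, which only exists on the Livy server's local filesystem, not on the NodeManager running the attempt. One possible workaround (a sketch on my side, not a confirmed fix, and paths are examples) is to copy the REPL jars to HDFS and point `livy.repl.jars` at them:

{code}
# hypothetical HDFS location -- adjust to your cluster
hdfs dfs -mkdir -p /user/livy/repl_2.11-jars
hdfs dfs -put /opt/livy-0.5.0-incubating-bin/repl_2.11-jars/*.jar /user/livy/repl_2.11-jars/

# then in livy.conf (whether wildcards are accepted here may depend on
# the Livy version; listing the jars explicitly is the safe option)
livy.repl.jars = hdfs:///user/livy/repl_2.11-jars/*
{code}

After that, restart the Livy server and recreate the session so the new configuration takes effect.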
Is there any solution?

Thanks.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)