Posted to users@zeppelin.apache.org by Иван Шаповалов <sh...@gmail.com> on 2017/06/26 10:40:19 UTC
Livy failure
Hi all,
I am trying to connect to Livy and run a note through it.
*created interpreter setting:*
livy.spark.driver.cores 1
livy.spark.driver.memory 1g
livy.spark.dynamicAllocation.cachedExecutorIdleTimeout 600
livy.spark.dynamicAllocation.enabled true
livy.spark.dynamicAllocation.initialExecutors 1
livy.spark.dynamicAllocation.maxExecutors 2
livy.spark.dynamicAllocation.minExecutors 1
livy.spark.executor.cores 1
livy.spark.executor.instances 3
livy.spark.executor.memory 1g
livy.spark.jars.packages
zeppelin.livy.concurrentSQL false
zeppelin.livy.displayAppInfo false
zeppelin.livy.keytab
zeppelin.livy.principal zeppelin
zeppelin.livy.pull_status.interval.millis 1000
zeppelin.livy.session.create_timeout 600
zeppelin.livy.spark.sql.maxResult 1000
zeppelin.livy.url http://correct-path:correct-port
*and have following livy server config:*
livy.impersonation.enabled = false
livy.repl.enableHiveContext = true
ivy.server.csrf_protection.enabled = true
livy.spark.master = yarn.cluster
*Trying to run a paragraph gives me:*
INFO [2017-06-26 11:39:22,036] ({pool-2-thread-2}
BaseLivyInterprereter.java[createSession]:204) - Session 3 is in state
shutting_down, appId application_1498447942743_0011
INFO [2017-06-26 11:39:23,184] ({pool-2-thread-2}
BaseLivyInterprereter.java[createSession]:204) - Session 3 is in state
dead, appId application_1498447942743_0011
ERROR [2017-06-26 11:39:23,185] ({pool-2-thread-2}
BaseLivyInterprereter.java[createSession]:214) - Error when creating
livy session for user anonymous
org.apache.zeppelin.livy.LivyException: Session 3 is finished, appId:
application_1498447942743_0011, log: [YARN Diagnostics:, AM container
is launched, waiting for AM container to Register with RM]
at org.apache.zeppelin.livy.BaseLivyInterprereter.createSession(BaseLivyInterprereter.java:209)
at org.apache.zeppelin.livy.BaseLivyInterprereter.initLivySession(BaseLivyInterprereter.java:98)
at org.apache.zeppelin.livy.BaseLivyInterprereter.open(BaseLivyInterprereter.java:80)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:483)
at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Could anyone please help me understand what I am doing wrong?
Thanks
--
Ivan Shapovalov
Kharkov, Ukraine
Re: Livy failure
Posted by Иван Шаповалов <sh...@gmail.com>.
Got it, thank you.
2017-06-26 14:58 GMT+03:00 Jeff Zhang <zj...@gmail.com>:
>
> The error message is clear, you didn't set the right configuration. You
> enable the dynamic allocation, and set max executor as 2 but you set the
> initial executor as 3
>
>
>
> Caused by: java.lang.IllegalArgumentException: requirement failed: initial executor number 3 must between min executor number 1 and max executor number 2
>
>
>
>
> Иван Шаповалов <sh...@gmail.com> wrote on Mon, Jun 26, 2017 at 19:53:
>
>> Here is the failure from resource manager
>>
>> 17/06/26 05:55:43 ERROR ApplicationMaster: Uncaught exception:
>> org.apache.spark.SparkException: Exception thrown in awaitResult:
>> at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:194)
>> at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:401)
>> at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:254)
>> at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:766)
>> at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:67)
>> at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:66)
>> at java.security.AccessController.doPrivileged(Native Method)
>> at javax.security.auth.Subject.doAs(Subject.java:422)
>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
>> at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66)
>> at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:764)
>> at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
>> Caused by: java.lang.reflect.InvocationTargetException
>> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>> at java.lang.reflect.Method.invoke(Method.java:498)
>> at com.cloudera.livy.repl.SparkContextInitializer$class.spark2CreateContext(SparkContextInitializer.scala:94)
>> at com.cloudera.livy.repl.SparkContextInitializer$class.createSparkContext(SparkContextInitializer.scala:34)
>> at com.cloudera.livy.repl.SparkInterpreter.createSparkContext(SparkInterpreter.scala:36)
>> at com.cloudera.livy.repl.SparkInterpreter$$anonfun$start$1.apply$mcV$sp(SparkInterpreter.scala:89)
>> at com.cloudera.livy.repl.SparkInterpreter$$anonfun$start$1.apply(SparkInterpreter.scala:68)
>> at com.cloudera.livy.repl.SparkInterpreter$$anonfun$start$1.apply(SparkInterpreter.scala:68)
>> at com.cloudera.livy.repl.AbstractSparkInterpreter.restoreContextClassLoader(AbstractSparkInterpreter.scala:256)
>> at com.cloudera.livy.repl.SparkInterpreter.start(SparkInterpreter.scala:68)
>> at com.cloudera.livy.repl.Session$$anonfun$1.apply(Session.scala:76)
>> at com.cloudera.livy.repl.Session$$anonfun$1.apply(Session.scala:74)
>> at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
>> at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>> at java.lang.Thread.run(Thread.java:745)
>> Caused by: java.lang.IllegalArgumentException: requirement failed: initial executor number 3 must between min executor number 1 and max executor number 2
>> at scala.Predef$.require(Predef.scala:224)
>> at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$.getInitialTargetExecutorNumber(YarnSparkHadoopUtil.scala:304)
>> at org.apache.spark.scheduler.cluster.YarnClusterSchedulerBackend.start(YarnClusterSchedulerBackend.scala:37)
>> at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:156)
>> at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
>> at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2320)
>> at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
>> at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
>> at scala.Option.getOrElse(Option.scala:121)
>> at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
>> ... 19 more
>> 17/06/26 05:55:43 INFO ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: User class threw exception: java.lang.reflect.InvocationTargetException)
>> 17/06/26 05:55:43 INFO ApplicationMaster: Deleting staging directory
>>
>>
>> thank you
>>
>>
>> 2017-06-26 14:16 GMT+03:00 Jeff Zhang <zj...@gmail.com>:
>>
>>>
>>> Could you check the yarn app log ?
>>>
>>>
>>>
>>> Иван Шаповалов <sh...@gmail.com> wrote on Mon, Jun 26, 2017 at 18:40:
>>>
>>>> Hi all,
>>>> I am trying to connect and run note via livy
>>>>
>>>> *created interpreter setting:*
>>>> livy.spark.driver.cores 1
>>>> livy.spark.driver.memory 1g
>>>> livy.spark.dynamicAllocation.cachedExecutorIdleTimeout 600
>>>> livy.spark.dynamicAllocation.enabled true
>>>> livy.spark.dynamicAllocation.initialExecutors 1
>>>> livy.spark.dynamicAllocation.maxExecutors 2
>>>> livy.spark.dynamicAllocation.minExecutors 1
>>>> livy.spark.executor.cores 1
>>>> livy.spark.executor.instances 3
>>>> livy.spark.executor.memory 1g
>>>> livy.spark.jars.packages
>>>> zeppelin.livy.concurrentSQL false
>>>> zeppelin.livy.displayAppInfo false
>>>> zeppelin.livy.keytab
>>>> zeppelin.livy.principal zeppelin
>>>> zeppelin.livy.pull_status.interval.millis 1000
>>>> zeppelin.livy.session.create_timeout 600
>>>> zeppelin.livy.spark.sql.maxResult 1000
>>>> zeppelin.livy.url http://correct-path:correct-port
>>>>
>>>> *and have following livy server config:*
>>>> livy.impersonation.enabled = false
>>>> livy.repl.enableHiveContext = true
>>>> ivy.server.csrf_protection.enabled = true
>>>> livy.spark.master = yarn.cluster
>>>>
>>>> *Trying to run paragraph drives me to:*
>>>>
>>>> INFO [2017-06-26 11:39:22,036] ({pool-2-thread-2} BaseLivyInterprereter.java[createSession]:204) - Session 3 is in state shutting_down, appId application_1498447942743_0011
>>>> INFO [2017-06-26 11:39:23,184] ({pool-2-thread-2} BaseLivyInterprereter.java[createSession]:204) - Session 3 is in state dead, appId application_1498447942743_0011
>>>> ERROR [2017-06-26 11:39:23,185] ({pool-2-thread-2} BaseLivyInterprereter.java[createSession]:214) - Error when creating livy session for user anonymous
>>>> org.apache.zeppelin.livy.LivyException: Session 3 is finished, appId: application_1498447942743_0011, log: [YARN Diagnostics:, AM container is launched, waiting for AM container to Register with RM]
>>>> at org.apache.zeppelin.livy.BaseLivyInterprereter.createSession(BaseLivyInterprereter.java:209)
>>>> at org.apache.zeppelin.livy.BaseLivyInterprereter.initLivySession(BaseLivyInterprereter.java:98)
>>>> at org.apache.zeppelin.livy.BaseLivyInterprereter.open(BaseLivyInterprereter.java:80)
>>>> at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
>>>> at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:483)
>>>> at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
>>>> at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
>>>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>>> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
>>>> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
>>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>>> at java.lang.Thread.run(Thread.java:745)
>>>>
>>>> Could anyone please help to understand what am I doing wrong ?
>>>> Thanks
>>>> --
>>>> Ivan Shapovalov
>>>> Kharkov, Ukraine
>>>>
>>>>
>>
>>
>> --
>> Ivan Shapovalov
>> Kharkov, Ukraine
>>
>>
--
Ivan Shapovalov
Kharkov, Ukraine
Re: Livy failure
Posted by Jeff Zhang <zj...@gmail.com>.
The error message is clear: the configuration is inconsistent. You enabled
dynamic allocation with maxExecutors set to 2, but the initial executor count
resolves to 3, because livy.spark.executor.instances is 3 and Spark takes the
largest of minExecutors, initialExecutors, and executor.instances as the
initial target:
Caused by: java.lang.IllegalArgumentException: requirement failed:
initial executor number 3 must between min executor number 1 and max
executor number 2
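The failed requirement can be reproduced outside the cluster. The sketch below (an assumption-laden approximation of Spark 2.x's check, not Spark's actual code) shows how the interpreter's `livy.spark.*` settings, mapped onto their `spark.*` equivalents, trip the same `requirement failed` error:

```python
def initial_target_executors(conf):
    """Sketch of the requirement that failed above (not Spark's actual code).

    Assumes Spark 2.x semantics: with dynamic allocation enabled, the
    initial target is the largest of minExecutors, initialExecutors and
    spark.executor.instances, and must lie within [min, max].
    """
    min_n = int(conf.get("spark.dynamicAllocation.minExecutors", "0"))
    max_n = int(conf.get("spark.dynamicAllocation.maxExecutors", str(2**31 - 1)))
    init_n = max(
        min_n,
        int(conf.get("spark.dynamicAllocation.initialExecutors", str(min_n))),
        int(conf.get("spark.executor.instances", "0")),
    )
    if not (min_n <= init_n <= max_n):
        # Mirrors the wording of the Spark error in the log above.
        raise ValueError(
            "requirement failed: initial executor number %d must between "
            "min executor number %d and max executor number %d"
            % (init_n, min_n, max_n))
    return init_n

# The livy.spark.* settings from the interpreter config, as spark.* keys:
conf = {
    "spark.dynamicAllocation.minExecutors": "1",
    "spark.dynamicAllocation.initialExecutors": "1",
    "spark.dynamicAllocation.maxExecutors": "2",
    "spark.executor.instances": "3",  # pushes the initial target to 3
}
```

Under these assumptions, removing `livy.spark.executor.instances` (or raising `livy.spark.dynamicAllocation.maxExecutors` to at least 3) satisfies the check.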
Иван Шаповалов <sh...@gmail.com> wrote on Mon, Jun 26, 2017 at 19:53:
> Here is the failure from resource manager
>
> 17/06/26 05:55:43 ERROR ApplicationMaster: Uncaught exception:
> org.apache.spark.SparkException: Exception thrown in awaitResult:
> at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:194)
> at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:401)
> at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:254)
> at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:766)
> at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:67)
> at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:66)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
> at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66)
> at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:764)
> at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
> Caused by: java.lang.reflect.InvocationTargetException
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:498)
> at com.cloudera.livy.repl.SparkContextInitializer$class.spark2CreateContext(SparkContextInitializer.scala:94)
> at com.cloudera.livy.repl.SparkContextInitializer$class.createSparkContext(SparkContextInitializer.scala:34)
> at com.cloudera.livy.repl.SparkInterpreter.createSparkContext(SparkInterpreter.scala:36)
> at com.cloudera.livy.repl.SparkInterpreter$$anonfun$start$1.apply$mcV$sp(SparkInterpreter.scala:89)
> at com.cloudera.livy.repl.SparkInterpreter$$anonfun$start$1.apply(SparkInterpreter.scala:68)
> at com.cloudera.livy.repl.SparkInterpreter$$anonfun$start$1.apply(SparkInterpreter.scala:68)
> at com.cloudera.livy.repl.AbstractSparkInterpreter.restoreContextClassLoader(AbstractSparkInterpreter.scala:256)
> at com.cloudera.livy.repl.SparkInterpreter.start(SparkInterpreter.scala:68)
> at com.cloudera.livy.repl.Session$$anonfun$1.apply(Session.scala:76)
> at com.cloudera.livy.repl.Session$$anonfun$1.apply(Session.scala:74)
> at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
> at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
> Caused by: java.lang.IllegalArgumentException: requirement failed: initial executor number 3 must between min executor number 1 and max executor number 2
> at scala.Predef$.require(Predef.scala:224)
> at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$.getInitialTargetExecutorNumber(YarnSparkHadoopUtil.scala:304)
> at org.apache.spark.scheduler.cluster.YarnClusterSchedulerBackend.start(YarnClusterSchedulerBackend.scala:37)
> at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:156)
> at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
> at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2320)
> at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
> at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
> at scala.Option.getOrElse(Option.scala:121)
> at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
> ... 19 more
> 17/06/26 05:55:43 INFO ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: User class threw exception: java.lang.reflect.InvocationTargetException)
> 17/06/26 05:55:43 INFO ApplicationMaster: Deleting staging directory
>
>
> thank you
>
>
> 2017-06-26 14:16 GMT+03:00 Jeff Zhang <zj...@gmail.com>:
>
>>
>> Could you check the yarn app log ?
>>
>>
>>
>> Иван Шаповалов <sh...@gmail.com> wrote on Mon, Jun 26, 2017 at 18:40:
>>
>>> Hi all,
>>> I am trying to connect and run note via livy
>>>
>>> *created interpreter setting:*
>>> livy.spark.driver.cores 1
>>> livy.spark.driver.memory 1g
>>> livy.spark.dynamicAllocation.cachedExecutorIdleTimeout 600
>>> livy.spark.dynamicAllocation.enabled true
>>> livy.spark.dynamicAllocation.initialExecutors 1
>>> livy.spark.dynamicAllocation.maxExecutors 2
>>> livy.spark.dynamicAllocation.minExecutors 1
>>> livy.spark.executor.cores 1
>>> livy.spark.executor.instances 3
>>> livy.spark.executor.memory 1g
>>> livy.spark.jars.packages
>>> zeppelin.livy.concurrentSQL false
>>> zeppelin.livy.displayAppInfo false
>>> zeppelin.livy.keytab
>>> zeppelin.livy.principal zeppelin
>>> zeppelin.livy.pull_status.interval.millis 1000
>>> zeppelin.livy.session.create_timeout 600
>>> zeppelin.livy.spark.sql.maxResult 1000
>>> zeppelin.livy.url http://correct-path:correct-port
>>>
>>> *and have following livy server config:*
>>> livy.impersonation.enabled = false
>>> livy.repl.enableHiveContext = true
>>> ivy.server.csrf_protection.enabled = true
>>> livy.spark.master = yarn.cluster
>>>
>>> *Trying to run paragraph drives me to:*
>>>
>>> INFO [2017-06-26 11:39:22,036] ({pool-2-thread-2} BaseLivyInterprereter.java[createSession]:204) - Session 3 is in state shutting_down, appId application_1498447942743_0011
>>> INFO [2017-06-26 11:39:23,184] ({pool-2-thread-2} BaseLivyInterprereter.java[createSession]:204) - Session 3 is in state dead, appId application_1498447942743_0011
>>> ERROR [2017-06-26 11:39:23,185] ({pool-2-thread-2} BaseLivyInterprereter.java[createSession]:214) - Error when creating livy session for user anonymous
>>> org.apache.zeppelin.livy.LivyException: Session 3 is finished, appId: application_1498447942743_0011, log: [YARN Diagnostics:, AM container is launched, waiting for AM container to Register with RM]
>>> at org.apache.zeppelin.livy.BaseLivyInterprereter.createSession(BaseLivyInterprereter.java:209)
>>> at org.apache.zeppelin.livy.BaseLivyInterprereter.initLivySession(BaseLivyInterprereter.java:98)
>>> at org.apache.zeppelin.livy.BaseLivyInterprereter.open(BaseLivyInterprereter.java:80)
>>> at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
>>> at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:483)
>>> at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
>>> at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
>>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
>>> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> Could anyone please help to understand what am I doing wrong ?
>>> Thanks
>>> --
>>> Ivan Shapovalov
>>> Kharkov, Ukraine
>>>
>>>
>
>
> --
> Ivan Shapovalov
> Kharkov, Ukraine
>
>
>>> at org.apache.zeppelin.livy.BaseLivyInterprereter.createSession(BaseLivyInterprereter.java:209)
>>> at org.apache.zeppelin.livy.BaseLivyInterprereter.initLivySession(BaseLivyInterprereter.java:98)
>>> at org.apache.zeppelin.livy.BaseLivyInterprereter.open(BaseLivyInterprereter.java:80)
>>> at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
>>> at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:483)
>>> at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
>>> at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
>>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>>> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
>>> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>> at java.lang.Thread.run(Thread.java:745)
>>>
>>> Could anyone please help to understand what am I doing wrong ?
>>> Thanks
>>> --
>>> Ivan Shapovalov
>>> Kharkov, Ukraine
>>>
>>>
>
>
> --
> Ivan Shapovalov
> Kharkov, Ukraine
>
>
Re: Livy failure
Posted by Иван Шаповалов <sh...@gmail.com>.
Here is the failure from the resource manager:
17/06/26 05:55:43 ERROR ApplicationMaster: Uncaught exception:
org.apache.spark.SparkException: Exception thrown in awaitResult:
at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:194)
at org.apache.spark.deploy.yarn.ApplicationMaster.runDriver(ApplicationMaster.scala:401)
at org.apache.spark.deploy.yarn.ApplicationMaster.run(ApplicationMaster.scala:254)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anonfun$main$1.apply$mcV$sp(ApplicationMaster.scala:766)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:67)
at org.apache.spark.deploy.SparkHadoopUtil$$anon$1.run(SparkHadoopUtil.scala:66)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866)
at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:66)
at org.apache.spark.deploy.yarn.ApplicationMaster$.main(ApplicationMaster.scala:764)
at org.apache.spark.deploy.yarn.ApplicationMaster.main(ApplicationMaster.scala)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at com.cloudera.livy.repl.SparkContextInitializer$class.spark2CreateContext(SparkContextInitializer.scala:94)
at com.cloudera.livy.repl.SparkContextInitializer$class.createSparkContext(SparkContextInitializer.scala:34)
at com.cloudera.livy.repl.SparkInterpreter.createSparkContext(SparkInterpreter.scala:36)
at com.cloudera.livy.repl.SparkInterpreter$$anonfun$start$1.apply$mcV$sp(SparkInterpreter.scala:89)
at com.cloudera.livy.repl.SparkInterpreter$$anonfun$start$1.apply(SparkInterpreter.scala:68)
at com.cloudera.livy.repl.SparkInterpreter$$anonfun$start$1.apply(SparkInterpreter.scala:68)
at com.cloudera.livy.repl.AbstractSparkInterpreter.restoreContextClassLoader(AbstractSparkInterpreter.scala:256)
at com.cloudera.livy.repl.SparkInterpreter.start(SparkInterpreter.scala:68)
at com.cloudera.livy.repl.Session$$anonfun$1.apply(Session.scala:76)
at com.cloudera.livy.repl.Session$$anonfun$1.apply(Session.scala:74)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.IllegalArgumentException: requirement failed: initial executor number 3 must between min executor number 1 and max executor number 2
at scala.Predef$.require(Predef.scala:224)
at org.apache.spark.deploy.yarn.YarnSparkHadoopUtil$.getInitialTargetExecutorNumber(YarnSparkHadoopUtil.scala:304)
at org.apache.spark.scheduler.cluster.YarnClusterSchedulerBackend.start(YarnClusterSchedulerBackend.scala:37)
at org.apache.spark.scheduler.TaskSchedulerImpl.start(TaskSchedulerImpl.scala:156)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:509)
at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2320)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:868)
at org.apache.spark.sql.SparkSession$Builder$$anonfun$6.apply(SparkSession.scala:860)
at scala.Option.getOrElse(Option.scala:121)
at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:860)
... 19 more
17/06/26 05:55:43 INFO ApplicationMaster: Unregistering ApplicationMaster with FAILED (diag message: User class threw exception: java.lang.reflect.InvocationTargetException)
17/06/26 05:55:43 INFO ApplicationMaster: Deleting staging directory
Thank you.
2017-06-26 14:16 GMT+03:00 Jeff Zhang <zj...@gmail.com>:
>
> Could you check the yarn app log ?
>
>
>
> Иван Шаповалов <sh...@gmail.com> wrote on Mon, Jun 26, 2017 at 6:40 PM:
>
>> Hi all,
>> I am trying to connect and run note via livy
>>
>> *created interpreter setting:*
>> livy.spark.driver.cores 1
>> livy.spark.driver.memory 1g
>> livy.spark.dynamicAllocation.cachedExecutorIdleTimeout 600
>> livy.spark.dynamicAllocation.enabled true
>> livy.spark.dynamicAllocation.initialExecutors 1
>> livy.spark.dynamicAllocation.maxExecutors 2
>> livy.spark.dynamicAllocation.minExecutors 1
>> livy.spark.executor.cores 1
>> livy.spark.executor.instances 3
>> livy.spark.executor.memory 1g
>> livy.spark.jars.packages
>> zeppelin.livy.concurrentSQL false
>> zeppelin.livy.displayAppInfo false
>> zeppelin.livy.keytab
>> zeppelin.livy.principal zeppelin
>> zeppelin.livy.pull_status.interval.millis 1000
>> zeppelin.livy.session.create_timeout 600
>> zeppelin.livy.spark.sql.maxResult 1000
>> zeppelin.livy.url http://correct-path:correct-port
>>
>> *and have following livy server config:*
>> livy.impersonation.enabled = false
>> livy.repl.enableHiveContext = true
>> ivy.server.csrf_protection.enabled = true
>> livy.spark.master = yarn.cluster
>>
>> *Trying to run paragraph drives me to:*
>>
>> INFO [2017-06-26 11:39:22,036] ({pool-2-thread-2} BaseLivyInterprereter.java[createSession]:204) - Session 3 is in state shutting_down, appId application_1498447942743_0011
>> INFO [2017-06-26 11:39:23,184] ({pool-2-thread-2} BaseLivyInterprereter.java[createSession]:204) - Session 3 is in state dead, appId application_1498447942743_0011
>> ERROR [2017-06-26 11:39:23,185] ({pool-2-thread-2} BaseLivyInterprereter.java[createSession]:214) - Error when creating livy session for user anonymous
>> org.apache.zeppelin.livy.LivyException: Session 3 is finished, appId: application_1498447942743_0011, log: [YARN Diagnostics:, AM container is launched, waiting for AM container to Register with RM]
>> at org.apache.zeppelin.livy.BaseLivyInterprereter.createSession(BaseLivyInterprereter.java:209)
>> at org.apache.zeppelin.livy.BaseLivyInterprereter.initLivySession(BaseLivyInterprereter.java:98)
>> at org.apache.zeppelin.livy.BaseLivyInterprereter.open(BaseLivyInterprereter.java:80)
>> at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
>> at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:483)
>> at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
>> at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
>> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>> at java.lang.Thread.run(Thread.java:745)
>>
>> Could anyone please help me understand what I am doing wrong?
>> Thanks
>> --
>> Ivan Shapovalov
>> Kharkov, Ukraine
>>
>>
--
Ivan Shapovalov
Kharkov, Ukraine
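[Editor's note: the root cause is visible in the stack trace above. With dynamic allocation enabled, Spark's YarnSparkHadoopUtil.getInitialTargetExecutorNumber takes livy.spark.executor.instances (3) into account when computing the initial executor target, which then falls outside the configured [minExecutors=1, maxExecutors=2] range. Removing livy.spark.executor.instances, or raising livy.spark.dynamicAllocation.maxExecutors to at least 3, avoids the failure. The following Python snippet is only a sketch of that check (the default values and the max-of-three rule are assumptions based on Spark 2.x behavior, not Spark's exact code):]

```python
def initial_target_executors(conf):
    """Sketch of the dynamic-allocation branch of Spark's
    YarnSparkHadoopUtil.getInitialTargetExecutorNumber (assumed behavior,
    not Spark's actual code). `conf` maps Spark property names to strings."""
    min_n = int(conf.get("spark.dynamicAllocation.minExecutors", 0))
    max_n = int(conf.get("spark.dynamicAllocation.maxExecutors", 2**31 - 1))
    initial_n = int(conf.get("spark.dynamicAllocation.initialExecutors", min_n))
    instances_n = int(conf.get("spark.executor.instances", 0))
    # Spark takes the largest of the three candidates as the initial target...
    target = max(min_n, initial_n, instances_n)
    # ...and then requires it to sit inside [min, max] (message text mirrors
    # the one in the stack trace, including Spark's own grammar).
    if not (min_n <= target <= max_n):
        raise ValueError(
            f"requirement failed: initial executor number {target} must between "
            f"min executor number {min_n} and max executor number {max_n}")
    return target

# The settings from this thread (livy.spark.* is passed through as spark.*):
conf = {
    "spark.dynamicAllocation.minExecutors": "1",
    "spark.dynamicAllocation.initialExecutors": "1",
    "spark.dynamicAllocation.maxExecutors": "2",
    "spark.executor.instances": "3",  # pushes the target to 3, outside [1, 2]
}
try:
    initial_target_executors(conf)
except ValueError as e:
    print(e)  # reproduces the requirement-failed message from the log

conf.pop("spark.executor.instances")  # drop the conflicting setting
print(initial_target_executors(conf))  # target falls back to 1, within [1, 2]
```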
Re: Livy failure
Posted by Jeff Zhang <zj...@gmail.com>.
Could you check the yarn app log?
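[Editor's note: on clusters with log aggregation enabled, the YARN application log suggested here can usually be fetched with the `yarn logs` CLI; the application id below is the one from the error in this thread, and exact flags vary by Hadoop version:]

```shell
# Print the aggregated container logs for the failed application
# (requires yarn.log-aggregation-enable=true on the cluster).
yarn logs -applicationId application_1498447942743_0011
```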
Иван Шаповалов <sh...@gmail.com> wrote on Mon, Jun 26, 2017 at 6:40 PM:
> Hi all,
> I am trying to connect and run a note via Livy.
>
> *created interpreter setting:*
> livy.spark.driver.cores 1
> livy.spark.driver.memory 1g
> livy.spark.dynamicAllocation.cachedExecutorIdleTimeout 600
> livy.spark.dynamicAllocation.enabled true
> livy.spark.dynamicAllocation.initialExecutors 1
> livy.spark.dynamicAllocation.maxExecutors 2
> livy.spark.dynamicAllocation.minExecutors 1
> livy.spark.executor.cores 1
> livy.spark.executor.instances 3
> livy.spark.executor.memory 1g
> livy.spark.jars.packages
> zeppelin.livy.concurrentSQL false
> zeppelin.livy.displayAppInfo false
> zeppelin.livy.keytab
> zeppelin.livy.principal zeppelin
> zeppelin.livy.pull_status.interval.millis 1000
> zeppelin.livy.session.create_timeout 600
> zeppelin.livy.spark.sql.maxResult 1000
> zeppelin.livy.url http://correct-path:correct-port
>
> *and have following livy server config:*
> livy.impersonation.enabled = false
> livy.repl.enableHiveContext = true
> ivy.server.csrf_protection.enabled = true
> livy.spark.master = yarn.cluster
>
> *Trying to run paragraph drives me to:*
>
> INFO [2017-06-26 11:39:22,036] ({pool-2-thread-2} BaseLivyInterprereter.java[createSession]:204) - Session 3 is in state shutting_down, appId application_1498447942743_0011
> INFO [2017-06-26 11:39:23,184] ({pool-2-thread-2} BaseLivyInterprereter.java[createSession]:204) - Session 3 is in state dead, appId application_1498447942743_0011
> ERROR [2017-06-26 11:39:23,185] ({pool-2-thread-2} BaseLivyInterprereter.java[createSession]:214) - Error when creating livy session for user anonymous
> org.apache.zeppelin.livy.LivyException: Session 3 is finished, appId: application_1498447942743_0011, log: [YARN Diagnostics:, AM container is launched, waiting for AM container to Register with RM]
> at org.apache.zeppelin.livy.BaseLivyInterprereter.createSession(BaseLivyInterprereter.java:209)
> at org.apache.zeppelin.livy.BaseLivyInterprereter.initLivySession(BaseLivyInterprereter.java:98)
> at org.apache.zeppelin.livy.BaseLivyInterprereter.open(BaseLivyInterprereter.java:80)
> at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:70)
> at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:483)
> at org.apache.zeppelin.scheduler.Job.run(Job.java:175)
> at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
> at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
> at java.lang.Thread.run(Thread.java:745)
>
> Could anyone please help me understand what I am doing wrong?
> Thanks
> --
> Ivan Shapovalov
> Kharkov, Ukraine
>
>