Posted to users@zeppelin.apache.org by ๏̯͡๏ <ÐΞ€ρ@Ҝ> <de...@gmail.com> on 2015/09/25 04:20:53 UTC
Git Zeppelin on Spark 1.5.0
I am unable to get Zeppelin to run on Spark 1.5.0. I am building from the latest Zeppelin code in Git.
Error:
Driver failed to launch
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hadoop/yarn/local/usercache/zeppelin/filecache/18/spark-assembly-1.5.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.3.1.0-2574/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
15/09/24 19:18:29 INFO yarn.ApplicationMaster: Registered signal handlers for [TERM, HUP, INT]
Unknown/unsupported param List(--num-executors, 18)
Usage: org.apache.spark.deploy.yarn.ApplicationMaster [options]
Options:
  --jar JAR_PATH         Path to your application's JAR file
  --class CLASS_NAME     Name of your application's main class
  --primary-py-file      A main Python file
  --primary-r-file       A main R file
  --py-files PY_FILES    Comma-separated list of .zip, .egg, or .py files to
                         place on the PYTHONPATH for Python apps.
  --args ARGS            Arguments to be passed to your application's main class.
                         Multiple invocations are possible, each will be
                         passed in order.
  --num-executors NUM    Number of executors to start (Default: 2)
  --executor-cores NUM   Number of cores for the executors (Default: 1)
  --executor-memory MEM  Memory per executor (e.g. 1000M, 2G) (Default: 1G)
Log Type: stdout
Log Upload Time: Thu Sep 24 19:18:30 -0700 2015
Log Length: 0
--
Deepak
Re: Git Zeppelin on Spark 1.5.0
Posted by Randy Gelhausen <rg...@gmail.com>.
Here are the steps I use to build and run on HDP 2.3.0:
https://gist.github.com/randerzander/5c6ca7bdd06876c9b247
Specifically, you need to set spark.driver.extraJavaOptions and
spark.yarn.am.extraJavaOptions; otherwise YARN will reject the
application launch.
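For reference, on HDP those two properties typically carry a -Dhdp.version flag so that the ${hdp.version} placeholders in HDP's YARN classpath resolve at launch. The sketch below shows one way to wire this into Zeppelin; the 2.3.1.0-2574 build string is an assumption taken from the jar path in the log above, so substitute your cluster's actual build number (check /usr/hdp on a cluster node):

```shell
# conf/zeppelin-env.sh -- sketch, not verified against every Zeppelin build.
# The HDP build number 2.3.1.0-2574 below is inferred from the log above;
# replace it with the version installed on your cluster.
export MASTER=yarn-client
export HADOOP_CONF_DIR=/etc/hadoop/conf

# Pass -Dhdp.version to both the driver and the YARN application master so
# HDP's ${hdp.version} classpath placeholders resolve.
export SPARK_SUBMIT_OPTIONS="--conf spark.driver.extraJavaOptions=-Dhdp.version=2.3.1.0-2574 --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=2.3.1.0-2574"
```

Equivalently, the same two lines can go into Spark's conf/spark-defaults.conf (`spark.driver.extraJavaOptions -Dhdp.version=…` and `spark.yarn.am.extraJavaOptions -Dhdp.version=…`) if you prefer to configure Spark itself rather than Zeppelin's submit options.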
On Thu, Sep 24, 2015 at 10:20 PM, ÐΞ€ρ@Ҝ (๏̯͡๏) <de...@gmail.com> wrote:
> I am unable to get zeppelin to run on Spark 1.5.0
> I have the latest code from git for zeppelin..
> [original message quoted in full]