Posted to user@hive.apache.org by Mich Talebzadeh <mi...@peridale.co.uk> on 2015/12/04 12:05:39 UTC

FW: Getting error when trying to start master node after building spark 1.3

I sent this one to the Spark user group but got no response.

Hi,

I am trying to make Hive work with Spark.

 

I have been told that I need to use Spark 1.3 and build it from source
WITHOUT the Hive libraries.

 

I have built it as follows:

 

./make-distribution.sh --name "hadoop2-without-hive" --tgz "-Pyarn,hadoop-provided,hadoop-2.4,parquet-provided"

 

 

Now the issue is that I cannot start the master node, which I think I need
in order to make Hive on Spark work!

 

When I try

 

hduser@rhes564::/usr/lib/spark-1.3.0-bin-hadoop2-without-hive/sbin> ./start-master.sh

starting org.apache.spark.deploy.master.Master, logging to /usr/lib/spark-1.3.0-bin-hadoop2-without-hive/sbin/../logs/spark-hduser-org.apache.spark.deploy.master.Master-1-rhes564.out

failed to launch org.apache.spark.deploy.master.Master:

        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

        ... 6 more

full log in /usr/lib/spark-1.3.0-bin-hadoop2-without-hive/sbin/../logs/spark-hduser-org.apache.spark.deploy.master.Master-1-rhes564.out

 

I get

 

Spark Command: /usr/java/latest/bin/java -cp :/usr/lib/spark-1.3.0-bin-hadoop2-without-hive/sbin/../conf:/usr/lib/spark-1.3.0-bin-hadoop2-without-hive/lib/spark-assembly-1.3.0-hadoop2.4.0.jar:/home/hduser/hadoop-2.6.0/etc/hadoop -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m org.apache.spark.deploy.master.Master --ip 50.140.197.217 --port 7077 --webui-port 8080

========================================

 

Exception in thread "main" java.lang.NoClassDefFoundError: org/slf4j/Logger

        at java.lang.Class.getDeclaredMethods0(Native Method)

        at java.lang.Class.privateGetDeclaredMethods(Class.java:2521)

        at java.lang.Class.getMethod0(Class.java:2764)

        at java.lang.Class.getMethod(Class.java:1653)

        at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)

        at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)

Caused by: java.lang.ClassNotFoundException: org.slf4j.Logger

        at java.net.URLClassLoader$1.run(URLClassLoader.java:366)

        at java.net.URLClassLoader$1.run(URLClassLoader.java:355)

        at java.security.AccessController.doPrivileged(Native Method)

        at java.net.URLClassLoader.findClass(URLClassLoader.java:354)

        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)

        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

        ... 6 more
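
A quick diagnostic, sketched from the command line above (the jar path is the one shown there): check whether the assembly actually contains the missing class.

jar -tf /usr/lib/spark-1.3.0-bin-hadoop2-without-hive/lib/spark-assembly-1.3.0-hadoop2.4.0.jar | grep org/slf4j/Logger

No output would mean the "hadoop-provided" assembly shipped without slf4j, which matches the NoClassDefFoundError above and the spark-env.sh workaround sketched earlier.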

 

Any advice will be appreciated.

 

Thanks,

 

Mich

 

 

NOTE: The information in this email is proprietary and confidential. This
message is for the designated recipient only; if you are not the intended
recipient, you should destroy it immediately. Any information in this
message shall not be understood as given or endorsed by Peridale Technology
Ltd, its subsidiaries or their employees, unless expressly so stated. It is
the responsibility of the recipient to ensure that this email is virus free;
therefore neither Peridale Ltd, its subsidiaries nor their employees accept
any responsibility.

 


Re: FW: Getting error when trying to start master node after building spark 1.3

Posted by Xuefu Zhang <xz...@cloudera.com>.
1.3.1 is what is officially supported by Hive 1.2.1; 1.3.0 might be okay too.
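
(A small aside, not from the thread: to confirm which version a given Spark tree actually is, /path/to/spark/bin/spark-submit --version prints it.)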


RE: FW: Getting error when trying to start master node after building spark 1.3

Posted by Mich Talebzadeh <mi...@peridale.co.uk>.
Appreciated the response. Just to clarify: both my own build and the prebuilt download are Spark 1.3; this is the version I am attempting to make work.

 

Thanks

 

Mich

 


Re: FW: Getting error when trying to start master node after building spark 1.3

Posted by Xuefu Zhang <xz...@cloudera.com>.
My last attempt:

1. Make sure the spark-assembly.jar from your own build doesn't contain
Hive classes, using the "jar -tf spark-assembly.jar | grep hive" command. Copy
it to Hive's /lib directory. After this, you can forget everything about
this build.

2. Download the prebuilt tarball from the Spark download site and deploy it.
Forget about Hive for a moment. Make sure the cluster comes up and functions.

3. Unset the environment variable SPARK_HOME before you start Hive. Start
Hive, and run the "set spark.home=/path/to/spark/dir" command. Then run the
other commands as you did previously when trying Hive on Spark. (The three
steps are sketched as a shell transcript below.)
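
Pulling those three steps together, a rough shell transcript might look like the following. All paths and file names here are illustrative stand-ins, not values from this thread:

# 1. Verify the custom assembly carries no Hive classes, then give it to Hive.
jar -tf spark-assembly-1.3.0-hadoop2.4.0.jar | grep hive   # expect no output
cp spark-assembly-1.3.0-hadoop2.4.0.jar /usr/lib/hive/lib/

# 2. Deploy a stock prebuilt Spark and check the cluster itself comes up.
tar -xzf spark-1.3.1-bin-hadoop2.4.tgz -C /usr/lib
/usr/lib/spark-1.3.1-bin-hadoop2.4/sbin/start-master.sh

# 3. Start Hive with SPARK_HOME unset and point spark.home at the prebuilt tree.
unset SPARK_HOME
hive
hive> set spark.home=/usr/lib/spark-1.3.1-bin-hadoop2.4;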

