Posted to user@spark.apache.org by Kapil Malik <km...@adobe.com> on 2013/11/12 11:57:14 UTC

shark-shell not launching in cluster

Hi all,

I have a standalone Spark + Shark cluster, set up following the steps at https://github.com/amplab/shark/wiki/Running-Shark-on-a-Cluster
$SPARK_HOME/spark-shell works fine. I am able to access files and perform operations.
$SHARK_HOME/bin/shark also works fine; I am able to access Hive tables and perform operations.

However,
$SHARK_HOME/bin/shark-shell does not work :(

Logs -
Using Scala version 2.9.3 (OpenJDK 64-Bit Server VM, Java 1.6.0_27)
Initializing interpreter...
error: error while loading <root>, error in opening zip file
Failed to initialize compiler: object scala not found.

I've set SCALA_HOME correctly in shark-env.sh and also in spark-env.sh.
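
The relevant shark-env.sh entries look roughly like this (every path and the master URL below are placeholders, not my actual values):

```shell
# Sketch of the shark-env.sh entries involved; all paths and hosts
# below are placeholders, substitute your own installation locations.
export SCALA_HOME=/opt/scala-2.9.3        # must point at the Scala 2.9.3 install Shark was built for
export SPARK_HOME=/opt/spark-0.8.0
export HIVE_HOME=/opt/hive
export MASTER=spark://master-host:7077    # standalone master URL
```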

This failure prevents me from using sql2rdd operations. Can you please suggest steps to troubleshoot this, or a way to make sql2rdd work with spark-shell (if not shark-shell)?

Any suggestions?

Thanks and regards,

Kapil Malik | kmalik@adobe.com



RE: shark-shell not launching in cluster

Posted by Kapil Malik <km...@adobe.com>.
Hi all,

Thanks for suggestions.

@Andre,
Yes, I had checked all the binaries and also (spark/shark)-env.sh on all nodes.

@Matei,
I followed the steps from https://github.com/amplab/shark/wiki/Running-Shark-on-a-Cluster, so I had not built Spark / Shark on my machine but was using pre-built binaries.
I ran sbt clean and sbt assembly for Spark (with the required changes for CDH 4.4) on all nodes. The Spark cluster loads up fine as before.

For Shark, when I run:
sudo SHARK_HADOOP_VERSION=2.0.0-cdh4.4.0 SHARK_YARN=true sbt/sbt assembly
it gives me an exception:

[info] Compiling 96 Scala sources and 5 Java sources to /home/<>/shark-0.8.0-bin-cdh4/shark-0.8.0/target/scala-2.9.3/classes...
[error] error while loading <root>, error in opening zip file
scala.tools.nsc.MissingRequirementError: object scala not found.
...
...
This is similar to the error I saw when running shark-shell.
I'm on Ubuntu 12.04 and have set JAVA_HOME to JDK 7, which is also on my PATH.

Now I'm not even able to run ./shark (i.e. the Hive console), as my Shark installation is corrupted :)
I will re-sync the pre-built libraries to fix that, but the original problem remains.

Regards,

Kapil Malik | kmalik@adobe.com




Re: shark-shell not launching in cluster

Posted by Matei Zaharia <ma...@gmail.com>.
It might mean one of your JARs is corrupted. Try doing sbt clean and then sbt assembly again.

Matei



Re: shark-shell not launching in cluster

Posted by Josh Rosen <ro...@gmail.com>.
I've seen this "error: error while loading <root>, error in opening zip
file" before, but I'm not exactly sure what causes it.  Here's a JIRA
discussing that error in earlier versions of Spark:
https://spark-project.atlassian.net/browse/SPARK-692
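
Since a jar is just a zip archive, one quick way to narrow it down is to test whether the jars on the classpath actually open. The paths below are only a guess at a typical layout, so adjust them to your install:

```shell
# Test each jar as a zip archive; "error in opening zip file" usually
# means one of these is truncated or corrupt. All paths here are
# assumptions about a typical Scala/Spark/Shark layout.
for jar in "$SCALA_HOME"/lib/scala-library.jar \
           "$SCALA_HOME"/lib/scala-compiler.jar \
           "$SPARK_HOME"/assembly/target/scala-*/*.jar \
           "$SHARK_HOME"/target/scala-*/*.jar; do
  if unzip -tq "$jar" >/dev/null 2>&1; then
    echo "OK      $jar"
  else
    echo "BROKEN  $jar"   # a missing or unreadable file also shows up here
  fi
done
```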



Re: shark-shell not launching in cluster

Posted by Andre Schumacher <sc...@icsi.berkeley.edu>.
Hi,

Have you tried it in local mode?

The error message seems to indicate problems with the classpath, so you
may want to make sure that all the binaries are present on all nodes at
the same location (in addition to making sure the {spark,shark}-env.sh
settings are correct on all nodes).
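
One way to check that is to compare checksums of the assembly jar across nodes. The jar path and host names below are placeholders, not anything Spark or Shark defines:

```shell
# Spot-check that the Shark assembly jar is identical on every node.
# JAR and HOSTS are placeholders for this cluster; the exact jar name
# depends on your build, so check your target directory.
JAR="${SHARK_HOME:-/opt/shark}/target/scala-2.9.3/shark-assembly.jar"
HOSTS="${HOSTS:-}"   # e.g. HOSTS="node1 node2 node3"

# Local reference checksum (prints MISSING if the jar is absent here too).
local_sum=$(md5sum "$JAR" 2>/dev/null | awk '{print $1}')
echo "local: ${local_sum:-MISSING}"

for host in $HOSTS; do
  remote_sum=$(ssh "$host" md5sum "$JAR" 2>/dev/null | awk '{print $1}')
  if [ -n "$remote_sum" ] && [ "$remote_sum" = "$local_sum" ]; then
    echo "$host: OK"
  else
    echo "$host: MISMATCH or missing (${remote_sum:-none})"
  fi
done
```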

Andre
