Posted to users@zeppelin.apache.org by Jeremy Cunningham <je...@statefarm.com> on 2016/09/07 19:18:38 UTC

java.lang.ClassNotFoundException: org.apache.spark.repl.SparkCommandLine

I have Zeppelin 0.6.0 running on a Linux server; this server is also an edge node of my Hadoop cluster.  The Hadoop cluster has Spark 1.5 running on it.  By pointing the Spark master to yarn-client, I can run Spark in Zeppelin without a problem.  I also have Spark 2.0 unpacked in my home directory, and I can navigate to its bin/spark-shell and run the Spark 2.0 REPL with --master yarn-client.  That works fine.  When I point SPARK_HOME to that Spark 2.0 directory so that Zeppelin will run Spark 2.0, I get the error message above.  Any suggestions on what I am doing wrong?  I do not have rights to simply upgrade the cluster to Spark 2.0.
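
For reference, the relevant configuration lives in conf/zeppelin-env.sh and looks roughly like this (the paths below are illustrative placeholders, not my actual ones):

    # conf/zeppelin-env.sh -- illustrative paths only
    export SPARK_HOME=/home/jeremy/spark-2.0.0-bin-hadoop2.6   # Spark 2.0 unpacked under my home directory
    export HADOOP_CONF_DIR=/etc/hadoop/conf                    # so yarn-client mode can reach the cluster
    export MASTER=yarn-client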

Thanks,
Jeremy

Re: java.lang.ClassNotFoundException: org.apache.spark.repl.SparkCommandLine

Posted by moon soo Lee <mo...@apache.org>.
Zeppelin supports Spark 2.0 starting from the 0.6.1 release. Check the "Available
interpreters" section on the download page [1].

Please try the 0.6.1 release or build from the master branch, and let us know if it
works!
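
If you build from master, the command is roughly the following (profile names are from memory, so please double-check the build section of the README):

    git clone https://github.com/apache/zeppelin.git
    cd zeppelin
    mvn clean package -DskipTests -Pspark-2.0 -Phadoop-2.6 -Pyarn -Ppyspark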

Thanks,
moon


[1] http://zeppelin.apache.org/download.html

