Posted to dev@zeppelin.apache.org by Corneau Damien <co...@gmail.com> on 2015/09/07 11:32:37 UTC

Error when trying to use spark interpreter

I'm always building using: mvn clean package -Pspark-1.3
-Dhadoop.version=2.2.0 -Phadoop-2.2 -DskipTests

However, today I haven't been able to run anything with the Spark
interpreter.


Exception in thread "pool-1-thread-2" java.lang.NoClassDefFoundError: scala/tools/nsc/settings/AbsSettings$AbsSetting
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:190)
    at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer.createInterpreter(RemoteInterpreterServer.java:136)
    at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Processor$createInterpreter.getResult(RemoteInterpreterService.java:990)
    at org.apache.zeppelin.interpreter.thrift.RemoteInterpreterService$Processor$createInterpreter.getResult(RemoteInterpreterService.java:975)
    at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
    at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
    at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: scala.tools.nsc.settings.AbsSettings$AbsSetting
    at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
    ... 11 more

Re: Error when trying to use spark interpreter

Posted by Corneau Damien <co...@gmail.com>.
Alright, just fixed it.
It turned out I had old HADOOP_HOME and SPARK_HOME environment variables
set in my .bash_profile.
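
In case anyone else hits this, a quick way to check is something like the
following (a rough sketch; adjust the file if your shell sources something
other than .bash_profile):

    # Show what the current shell session is carrying
    echo "SPARK_HOME=$SPARK_HOME"
    echo "HADOOP_HOME=$HADOOP_HOME"

    # Find where they get set, then remove those export lines
    grep -n -e SPARK_HOME -e HADOOP_HOME ~/.bash_profile

    # Clear them in the current session before restarting Zeppelin
    unset SPARK_HOME HADOOP_HOME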


Re: Error when trying to use spark interpreter

Posted by "Amos B. Elberg" <am...@gmail.com>.
Did you use a prebuilt Spark or compile it yourself?


Re: Error when trying to use spark interpreter

Posted by Corneau Damien <co...@gmail.com>.
I ran git clean -dxf to get a clean repository, and I even deleted the .m2
folder.
I'm not using any custom configuration, and I also tried with Spark 1.4,
but I'm still getting the same error.

These are the files I have in the interpreter/spark/dep directory:

datanucleus-api-jdo-3.2.6.jar
datanucleus-rdbms-3.2.9.jar
datanucleus-core-3.2.10.jar
zeppelin-spark-dependencies-0.6.0-incubating-SNAPSHOT.jar
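
For what it's worth, one way to check whether the missing class actually
made it into the build (a rough check, assuming the scala-compiler classes
are supposed to end up in the spark-dependencies jar):

    # Look for the class from the NoClassDefFoundError inside the jar
    unzip -l interpreter/spark/dep/zeppelin-spark-dependencies-*.jar \
        | grep 'scala/tools/nsc/settings'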


Re: Error when trying to use spark interpreter

Posted by "Amos B. Elberg" <am...@gmail.com>.
Guys, this is very similar to the issues I was having since we changed to the "spark dependencies" module configuration.

To solve it, I had to switch to the spark-submit system. I also had to recompile Spark, since apparently some build configurations (including those with Hadoop provided) don't put some classes where Zeppelin now expects them. Ultimately, I had to wipe out the Spark and Zeppelin directories completely and rebuild from scratch.

That said, I still feel there's something going on with the build process, the class loader, or both since that spark-dependency change that we just don't understand yet.
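
For reference, by "switching to the spark-submit system" I just mean
pointing Zeppelin at an external Spark install, roughly like this (a
sketch; the paths are examples from my machine, not anything Zeppelin
ships with):

    # conf/zeppelin-env.sh
    # With SPARK_HOME set, Zeppelin launches the Spark interpreter through
    # spark-submit instead of relying only on the bundled dependency jar.
    export SPARK_HOME=/opt/spark-1.3.1-bin-hadoop2.4   # example path
    export HADOOP_CONF_DIR=/etc/hadoop/conf            # example path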


Re: Error when trying to use spark interpreter

Posted by moon soo Lee <mo...@apache.org>.
I just tried the current master branch
(0b47cde4410bd52b0a9ec070fda15a22da96980d)
with your build command, and I couldn't reproduce that error.

Can you verify you have files under interpreter/spark/dep directory after
the build?
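
For example, something like this from the Zeppelin source root (just an
illustration):

    ls -l interpreter/spark/dep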

Best,
moon
