Posted to dev@toree.apache.org by Artemis User <ar...@dtechspace.com> on 2022/02/17 21:57:56 UTC
java.lang.NoClassDefFoundError: scala/App$class
Hi Toree Dev Team, Could someone please help with resolving the
following error when starting the Toree Scala Kernel inside Jupyter? My
configuration settings:
* JupyterHub version 3.2.9
* OpenJDK 11
* Spark 3.2.0 with Scala version 2.12
* Apache Toree 0.4.0-incubating
Thanks a lot for your help!
Exception in thread "main" java.lang.NoClassDefFoundError: scala/App$class
        at org.apache.toree.Main$.<init>(Main.scala:24)
        at org.apache.toree.Main$.<clinit>(Main.scala)
        at org.apache.toree.Main.main(Main.scala)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:955)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1043)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1052)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: scala.App$class
        at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:476)
        at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:589)
        at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
        ... 15 more
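A likely diagnosis for readers hitting the same trace: `scala.App$class` is an artifact of the Scala 2.11 trait encoding, which Scala 2.12 dropped, so a Toree build compiled against Scala 2.11 (as 0.4.0 was, targeting Spark 2.x) fails exactly this way when launched by a Scala 2.12 Spark. One way to confirm the mismatch from the command line (all paths are illustrative, not from this thread):

```shell
# Scala version bundled with the Spark distribution (assumes SPARK_HOME is set)
"$SPARK_HOME"/bin/spark-submit --version

# Scala version the Toree assembly was built against: the bundled
# scala-library classes ship with a library.properties that records
# the version number. The kernel install path and jar name below are
# illustrative -- locate yours with `jupyter kernelspec list`.
unzip -p /usr/local/share/jupyter/kernels/apache_toree_scala/lib/toree-assembly-0.4.0-incubating.jar \
  library.properties
```

If the first command reports Scala 2.12.x while the second reports 2.11.x, the kernel and Spark are binary-incompatible and no configuration change will fix it; a Toree build for the matching Scala version is needed.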
Re: java.lang.NoClassDefFoundError: scala/App$class
Posted by Artemis User <ar...@dtechspace.com>.
Thanks a lot! I guess the online doc was outdated...
On 2/18/22 5:31 PM, Luciano Resende wrote:
> Toree removed support for PySpark and R kernels.
>
> See:
> https://issues.apache.org/jira/browse/TOREE-487
> https://issues.apache.org/jira/browse/TOREE-488
>
>
> On Fri, Feb 18, 2022 at 11:21 AM Artemis User <ar...@dtechspace.com>
> wrote:
>
>> I was able to install version 0.5.0 and it works great with JupyterHub!
>> Thanks again for the help! However, when I tried to install the PySpark
>> and the SparkR kernel (using the command line option
>> --interpreters=Scala,PySpark,SparkR,SQL), I got the following error
>> messages:
>>
>> [ToreeInstall] ERROR | Unknown interpreter PySpark. Skipping
>> installation of PySpark interpreter
>> [ToreeInstall] ERROR | Unknown interpreter SparkR. Skipping installation
>> of SparkR interpreter
>>
>> Are these two kernels available in 0.5.0?
>>
>>
>> On 2/17/22 8:30 PM, Kevin Bates wrote:
>>> Hello,
>>>
>>> You should checkout v0.5.0-rc5:
>> https://github.com/apache/incubator-toree/releases/tag/v0.5.0-incubating-rc5
>> which includes support for Spark 3.2.
>>> On 2022/02/17 22:27:30 Artemis User wrote:
>>>> After looking at the Toree's Readme on github, I realized that the toree
>>>> version 0.4.x only supports Spark version 2.x, whereas the master branch
>>>> supports Spark 3.2.x. Could someone confirm this? In addition, is a
>>>> distribution package of toree from the master branch available?
>>>>
>>>> Thanks!
>>>>
>>>> On 2/17/22 4:57 PM, Artemis User wrote:
>>>>> Hi Toree Dev Team, Could someone please help with resolving the
>>>>> following error when starting the Toree Scala Kernel inside Jupyter?
>>>>> My configuration settings:
>>>>>
>>>>> * JupyterHub version 3.2.9
>>>>> * OpenJDK 11
>>>>> * Spark 3.2.0 with Scala version 2.12
>>>>> * Apache Toree 0.4.0-incubating
>>>>>
>>>>> Thanks a lot for your help!
>>>>>
>>>>> [stack trace snipped; identical to the first message above]
Re: java.lang.NoClassDefFoundError: scala/App$class
Posted by Luciano Resende <lu...@gmail.com>.
Toree removed support for PySpark and R kernels.
See:
https://issues.apache.org/jira/browse/TOREE-487
https://issues.apache.org/jira/browse/TOREE-488
On Fri, Feb 18, 2022 at 11:21 AM Artemis User <ar...@dtechspace.com>
wrote:
> I was able to install version 0.5.0 and it works great with JupyterHub!
> Thanks again for the help! However, when I tried to install the PySpark
> and the SparkR kernel (using the command line option
> --interpreters=Scala,PySpark,SparkR,SQL), I got the following error
> messages:
>
> [ToreeInstall] ERROR | Unknown interpreter PySpark. Skipping
> installation of PySpark interpreter
> [ToreeInstall] ERROR | Unknown interpreter SparkR. Skipping installation
> of SparkR interpreter
>
> Are these two kernels available in 0.5.0?
>
>
> On 2/17/22 8:30 PM, Kevin Bates wrote:
> > Hello,
> >
> > You should checkout v0.5.0-rc5:
> https://github.com/apache/incubator-toree/releases/tag/v0.5.0-incubating-rc5
> which includes support for Spark 3.2.
> >
> > On 2022/02/17 22:27:30 Artemis User wrote:
> >> After looking at the Toree's Readme on github, I realized that the toree
> >> version 0.4.x only supports Spark version 2.x, whereas the master branch
> >> supports Spark 3.2.x. Could someone confirm this? In addition, is a
> >> distribution package of toree from the master branch available?
> >>
> >> Thanks!
> >>
> >> On 2/17/22 4:57 PM, Artemis User wrote:
> >>> Hi Toree Dev Team, Could someone please help with resolving the
> >>> following error when starting the Toree Scala Kernel inside Jupyter?
> >>> My configuration settings:
> >>>
> >>> * JupyterHub version 3.2.9
> >>> * OpenJDK 11
> >>> * Spark 3.2.0 with Scala version 2.12
> >>> * Apache Toree 0.4.0-incubating
> >>>
> >>> Thanks a lot for your help!
> >>>
> >>> [stack trace snipped; identical to the first message above]
--
Luciano Resende
http://twitter.com/lresende1975
http://lresende.blogspot.com/
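Given the pointers above, a 0.5.x install restricted to the interpreters that remain would look like the following sketch. The `--spark_home` and `--interpreters` flags are Toree's documented install options; the assumption that Scala and SQL are the surviving interpreter names should be checked against the 0.5.x README.

```shell
# Install the Toree kernel with only the interpreters 0.5.x still ships
jupyter toree install --spark_home="$SPARK_HOME" --interpreters=Scala,SQL

# Confirm which kernelspecs were registered
jupyter kernelspec list
```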
Re: java.lang.NoClassDefFoundError: scala/App$class
Posted by Artemis User <ar...@dtechspace.com>.
I was able to install version 0.5.0 and it works great with JupyterHub!
Thanks again for the help! However, when I tried to install the PySpark
and SparkR kernels (using the command-line option
--interpreters=Scala,PySpark,SparkR,SQL), I got the following error
messages:
[ToreeInstall] ERROR | Unknown interpreter PySpark. Skipping installation of PySpark interpreter
[ToreeInstall] ERROR | Unknown interpreter SparkR. Skipping installation of SparkR interpreter
Are these two kernels available in 0.5.0?
On 2/17/22 8:30 PM, Kevin Bates wrote:
> Hello,
>
> You should checkout v0.5.0-rc5: https://github.com/apache/incubator-toree/releases/tag/v0.5.0-incubating-rc5 which includes support for Spark 3.2.
>
> On 2022/02/17 22:27:30 Artemis User wrote:
>> After looking at the Toree's Readme on github, I realized that the toree
>> version 0.4.x only supports Spark version 2.x, whereas the master branch
>> supports Spark 3.2.x. Could someone confirm this? In addition, is a
>> distribution package of toree from the master branch available?
>>
>> Thanks!
>>
>> On 2/17/22 4:57 PM, Artemis User wrote:
>>> Hi Toree Dev Team, Could someone please help with resolving the
>>> following error when starting the Toree Scala Kernel inside Jupyter?
>>> My configuration settings:
>>>
>>> * JupyterHub version 3.2.9
>>> * OpenJDK 11
>>> * Spark 3.2.0 with Scala version 2.12
>>> * Apache Toree 0.4.0-incubating
>>>
>>> Thanks a lot for your help!
>>>
>>> [stack trace snipped; identical to the first message above]
Re: java.lang.NoClassDefFoundError: scala/App$class
Posted by Kevin Bates <kb...@apache.org>.
Hello,
You should check out v0.5.0-rc5: https://github.com/apache/incubator-toree/releases/tag/v0.5.0-incubating-rc5, which includes support for Spark 3.2.
On 2022/02/17 22:27:30 Artemis User wrote:
> After looking at the Toree's Readme on github, I realized that the toree
> version 0.4.x only supports Spark version 2.x, whereas the master branch
> supports Spark 3.2.x. Could someone confirm this? In addition, is a
> distribution package of toree from the master branch available?
>
> Thanks!
>
> On 2/17/22 4:57 PM, Artemis User wrote:
> > Hi Toree Dev Team, Could someone please help with resolving the
> > following error when starting the Toree Scala Kernel inside Jupyter?
> > My configuration settings:
> >
> > * JupyterHub version 3.2.9
> > * OpenJDK 11
> > * Spark 3.2.0 with Scala version 2.12
> > * Apache Toree 0.4.0-incubating
> >
> > Thanks a lot for your help!
> >
> > [stack trace snipped; identical to the first message above]
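To act on this pointer, the usual route is to download the pip package attached to that release tag and install it locally before registering the kernel. A sketch, where the downloaded file name is an assumption (check the tag's actual assets):

```shell
# Install the downloaded Toree pip package (file name is an assumption)
pip install ./toree-0.5.0.tar.gz

# Register the kernel against the local Spark 3.2 installation
jupyter toree install --spark_home="$SPARK_HOME"
```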
Re: java.lang.NoClassDefFoundError: scala/App$class
Posted by Artemis User <ar...@dtechspace.com>.
After looking at Toree's README on GitHub, I realized that Toree
0.4.x only supports Spark 2.x, whereas the master branch
supports Spark 3.2.x. Could someone confirm this? In addition, is a
distribution package of Toree built from the master branch available?
Thanks!
On 2/17/22 4:57 PM, Artemis User wrote:
> Hi Toree Dev Team, Could someone please help with resolving the
> following error when starting the Toree Scala Kernel inside Jupyter?
> My configuration settings:
>
> * JupyterHub version 3.2.9
> * OpenJDK 11
> * Spark 3.2.0 with Scala version 2.12
> * Apache Toree 0.4.0-incubating
>
> Thanks a lot for your help!
>
> [stack trace snipped; identical to the first message above]