Posted to dev@toree.apache.org by "Chris Van Houtte (Jira)" <ji...@apache.org> on 2021/06/04 00:00:10 UTC

[jira] [Created] (TOREE-528) Unable to start kernel

Chris Van Houtte created TOREE-528:
--------------------------------------

             Summary: Unable to start kernel
                 Key: TOREE-528
                 URL: https://issues.apache.org/jira/browse/TOREE-528
             Project: TOREE
          Issue Type: Bug
          Components: Kernel
    Affects Versions: 0.4.0
         Environment: Debian 10
            Reporter: Chris Van Houtte


Hi there,

I cannot start the kernel on Debian 10:

$ scala -version
Scala code runner version 2.11.12 -- Copyright 2002-2017, LAMP/EPFL

I followed the installation instructions at
[https://toree.apache.org/docs/current/user/installation/]
in an Anaconda environment. The only deviation was adding the --user argument:
```
$ jupyter toree install --spark_home=/opt/spark-3.1.1-bin-hadoop3.2/ --user
```

Spark itself works fine when launched from the terminal.
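For reference, the kind of terminal check meant here would look something like the sketch below (run against the same Spark installation; the exact commands and output are not quoted from this report):

```
# Sanity checks against the same Spark install (illustrative only)
$ /opt/spark-3.1.1-bin-hadoop3.2/bin/spark-shell --version   # prints the Spark build and the Scala version it ships
$ /opt/spark-3.1.1-bin-hadoop3.2/bin/run-example SparkPi 10  # runs a small local example job
```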


Error message:

Starting Spark Kernel with SPARK_HOME=/opt/spark-3.1.1-bin-hadoop3.2/
 21/06/04 11:46:08 WARN Utils: Your hostname, debian-ds resolves to a loopback address: 127.0.1.1; using 10.0.6.51 instead (on interface eno1)
 21/06/04 11:46:08 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
 WARNING: An illegal reflective access operation has occurred
 WARNING: Illegal reflective access by org.apache.spark.unsafe.Platform (file:/opt/spark-3.1.1-bin-hadoop3.2/jars/spark-unsafe_2.12-3.1.1.jar) to constructor java.nio.DirectByteBuffer(long,int)
 WARNING: Please consider reporting this to the maintainers of org.apache.spark.unsafe.Platform
 WARNING: Use --illegal-access=warn to enable warnings of further illegal reflective access operations
 WARNING: All illegal access operations will be denied in a future release
 21/06/04 11:46:08 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
 Exception in thread "main" java.lang.NoClassDefFoundError: scala/App$class
 at org.apache.toree.Main$.<init>(Main.scala:24)
 at org.apache.toree.Main$.<clinit>(Main.scala)
 at org.apache.toree.Main.main(Main.scala)
 at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
 at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.base/java.lang.reflect.Method.invoke(Method.java:566)
 at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
 at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:951)
 at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
 at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
 at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
 at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1030)
 at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1039)
 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
 Caused by: java.lang.ClassNotFoundException: scala.App$class
 at java.base/java.net.URLClassLoader.findClass(URLClassLoader.java:471)
 at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:589)
 at java.base/java.lang.ClassLoader.loadClass(ClassLoader.java:522)
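For what it's worth, scala.App$class exists only in the Scala 2.11 standard library (it is part of the old trait encoding and was removed in 2.12), so this error usually points to a Scala binary mismatch: a Toree assembly built for Scala 2.11 being launched on the Scala 2.12 runtime that Spark 3.1.1 ships. A quick way to check both sides, as a sketch (the kernel-spec path assumes a Linux --user install and the default apache_toree_scala kernel name):

```
# Scala library bundled with the Spark distribution from this report
# (a scala-library-2.12.x jar is expected for Spark 3.1.1)
$ ls /opt/spark-3.1.1-bin-hadoop3.2/jars/ | grep scala-library

# Locate the installed Toree kernel spec and see which assembly jar it launches
$ jupyter kernelspec list
$ cat ~/.local/share/jupyter/kernels/apache_toree_scala/kernel.json
```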


