Posted to dev@toree.apache.org by "aldo (JIRA)" <ji...@apache.org> on 2017/04/06 09:38:41 UTC

[jira] [Comment Edited] (TOREE-399) Make Spark Kernel work on Windows

    [ https://issues.apache.org/jira/browse/TOREE-399?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15958656#comment-15958656 ] 

aldo edited comment on TOREE-399 at 4/6/17 9:38 AM:
----------------------------------------------------

Hi Jakob,

I created a quick run.bat with hardcoded values:

%SPARK_HOME%/bin/spark-submit --class org.apache.toree.Main C:\ProgramData\jupyter\kernels\apache_toree_scala\lib\toree-assembly-0.2.0.dev1-incubating-SNAPSHOT.jar

This gets past the previous error, but I am still getting errors, I guess related to some Scala config; see below. Any ideas?

Besides the error, with the goal of creating a Windows version of run.sh, it is not clear to me how the kernel.json variables are passed to run.bat and how I can refer to them in run.bat.
Any direction?
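
For reference, this is the rough direction I have in mind for run.bat, under the assumption that Jupyter simply executes the "argv" list from kernel.json with {connection_file} substituted, so that everything after the script path shows up in the batch file as positional arguments (%1, %2, ..., or %* for all of them). A sketch only, reusing the hardcoded jar path from above:

@echo off
rem Sketch: Jupyter runs this script with the remaining kernel.json argv entries
rem as arguments (e.g. --profile <connection_file>), and %* forwards them
rem unchanged to the Toree main class.
set TOREE_JAR=C:\ProgramData\jupyter\kernels\apache_toree_scala\lib\toree-assembly-0.2.0.dev1-incubating-SNAPSHOT.jar

"%SPARK_HOME%\bin\spark-submit" --class org.apache.toree.Main "%TOREE_JAR%" %*

Please correct me if the argv/%* assumption is not how the kernel spec passing actually works.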





17/03/31 09:55:29 [WARN] o.a.h.u.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/03/31 09:55:30 [INFO] o.a.t.b.l.StandardComponentInitialization$$anon$1 - Connecting to spark.master local[*]
[init] error: error while loading Object, Missing dependency 'object scala in compiler mirror', required by C:\Program Files\Java\jdk1.8.0_121\jre\lib\rt.jar(java/lang/Object.class)

Failed to initialize compiler: object scala in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.

Failed to initialize compiler: object scala in compiler mirror not found.
** Note that as of 2.8 scala does not assume use of the java classpath.
** For the old behavior pass -usejavacp to scala, or if using a Settings
** object programmatically, settings.usejavacp.value = true.
Exception in thread "main" java.lang.NullPointerException
        at scala.reflect.internal.SymbolTable.exitingPhase(SymbolTable.scala:256)
        at scala.tools.nsc.interpreter.IMain$Request.x$20$lzycompute(IMain.scala:896)
        at scala.tools.nsc.interpreter.IMain$Request.x$20(IMain.scala:895)
        at scala.tools.nsc.interpreter.IMain$Request.headerPreamble$lzycompute(IMain.scala:895)
        at scala.tools.nsc.interpreter.IMain$Request.headerPreamble(IMain.scala:895)
        at scala.tools.nsc.interpreter.IMain$Request$Wrapper.preamble(IMain.scala:918)
        at scala.tools.nsc.interpreter.IMain$CodeAssembler$$anonfun$apply$23.apply(IMain.scala:1337)
        at scala.tools.nsc.interpreter.IMain$CodeAssembler$$anonfun$apply$23.apply(IMain.scala:1336)
        at scala.tools.nsc.util.package$.stringFromWriter(package.scala:64)
        at scala.tools.nsc.interpreter.IMain$CodeAssembler$class.apply(IMain.scala:1336)
        at scala.tools.nsc.interpreter.IMain$Request$Wrapper.apply(IMain.scala:908)
        at scala.tools.nsc.interpreter.IMain$Request.compile$lzycompute(IMain.scala:1002)
        at scala.tools.nsc.interpreter.IMain$Request.compile(IMain.scala:997)
        at scala.tools.nsc.interpreter.IMain.compile(IMain.scala:579)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:567)
        at scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:565)
        at org.apache.toree.kernel.interpreter.scala.ScalaInterpreterSpecific$$anonfun$start$1.apply(ScalaInterpreterSpecific.scala:295)
        at org.apache.toree.kernel.interpreter.scala.ScalaInterpreterSpecific$$anonfun$start$1.apply(ScalaInterpreterSpecific.scala:289)
        at scala.tools.nsc.interpreter.IMain.beQuietDuring(IMain.scala:214)
        at org.apache.toree.kernel.interpreter.scala.ScalaInterpreterSpecific$class.start(ScalaInterpreterSpecific.scala:289)
        at org.apache.toree.kernel.interpreter.scala.ScalaInterpreter.start(ScalaInterpreter.scala:44)
        at org.apache.toree.kernel.interpreter.scala.ScalaInterpreter.init(ScalaInterpreter.scala:87)
        at org.apache.toree.boot.layer.InterpreterManager$$anonfun$initializeInterpreters$1.apply(InterpreterManager.scala:35)
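
In case it helps narrow this down, the hint in the log ("pass -usejavacp to scala, or ... settings.usejavacp.value = true") is the usual workaround for an embedded Scala compiler that cannot see the scala-library classes. A minimal standalone sketch of that suggestion (plain scala.tools.nsc API, not Toree code), just to show what the message refers to:

import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter.IMain

// Tell the embedded compiler/REPL to reuse the JVM's classpath so that
// "object scala" (the scala-library classes) can be found in the compiler mirror.
val settings = new Settings()
settings.usejavacp.value = true

val repl = new IMain(settings)
repl.interpret("""println("compiler initialized")""")

So my guess is that on Windows the Scala/Spark jars are not ending up on the classpath the driver-side compiler sees, which would explain the "object scala in compiler mirror not found" message.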




> Make Spark Kernel work on Windows
> ---------------------------------
>
>                 Key: TOREE-399
>                 URL: https://issues.apache.org/jira/browse/TOREE-399
>             Project: TOREE
>          Issue Type: New Feature
>         Environment: Windows 7/8/10
>            Reporter: aldo
>
> After a successful install of the Spark Kernel, the error "Failed to run command:" occurs when we select a Scala notebook from Jupyter.
> The error happens because kernel.json runs C:\\ProgramData\\jupyter\\kernels\\apache_toree_scala\\bin\\run.sh, which is a bash shell script and hence cannot work on Windows.
> Can you give me some direction to fix this, and I will implement it.
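
For anyone who picks this up, a minimal sketch of what a Windows kernel.json could look like once a run.bat exists; the run.bat path is hypothetical, and I am assuming the same --profile {connection_file} arguments that the shell version receives (Jupyter substitutes {connection_file} before launching):

{
  "display_name": "Apache Toree - Scala",
  "language": "scala",
  "argv": [
    "C:\\ProgramData\\jupyter\\kernels\\apache_toree_scala\\bin\\run.bat",
    "--profile",
    "{connection_file}"
  ]
}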



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)