Posted to dev@toree.apache.org by "Yuri Bakumenko (JIRA)" <ji...@apache.org> on 2016/12/16 18:12:58 UTC

[jira] [Commented] (TOREE-336) Toree not working with Apache Spark 2.0.0

    [ https://issues.apache.org/jira/browse/TOREE-336?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15755122#comment-15755122 ] 

Yuri Bakumenko commented on TOREE-336:
--------------------------------------

[~mingsterism] the build process uses https://github.com/apache/incubator-toree/blob/master/Makefile, which in turn has 2 relevant variables to override:

{code}
APACHE_SPARK_VERSION?=2.0.0
SCALA_VERSION?=2.11
{code}
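Because those are ?= assignments, they are only defaults: GNU make skips a ?= assignment when the variable is already defined, so a value passed on the make command line or exported in the environment wins. A minimal sketch of the environment-variable route (equivalent to passing the override on the command line):

{code}
# export the override instead of passing it to make directly;
# the ?= default in the Makefile is then ignored
export APACHE_SPARK_VERSION=2.0.2
make dev
{code}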

So for an interactive Jupyter session you can run:
{code}
make dev APACHE_SPARK_VERSION=2.0.2
{code}

or, in my case, to prebuild the release package so it can be reused from another build step, I run:
{code}
make clean release APACHE_SPARK_VERSION=2.0.2
{code}
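
A rough sketch of the follow-up step (assuming the release target leaves a pip-installable package under dist/toree-pip/; adjust the path to whatever your build actually produces):
{code}
# install the locally built package instead of the published one,
# then register the kernel against the local Spark installation
pip install dist/toree-pip/toree-*.tar.gz
jupyter toree install --spark-home=$SPARK_HOME
{code}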


> Toree not working with Apache Spark 2.0.0
> -----------------------------------------
>
>                 Key: TOREE-336
>                 URL: https://issues.apache.org/jira/browse/TOREE-336
>             Project: TOREE
>          Issue Type: Bug
>         Environment: OSX and ubuntu-14.04, both running scala 2.10.4 and spark 2.0.0
>            Reporter: Tianhui Li
>   Original Estimate: 168h
>  Remaining Estimate: 168h
>
> Following the instructions on https://github.com/apache/incubator-toree/blob/master/README.md, I run
> ```
> pip install --pre toree
> jupyter toree install --spark-home=$SPARK_HOME
> ```
> I'm able to build fine.  But upon starting the server and a new Scala (or any other type of) notebook, I get an error (provided below).  This seems related to using Scala 2.10 rather than 2.11 (see http://stackoverflow.com/questions/29339005/run-main-0-java-lang-nosuchmethoderror-scala-collection-immutable-hashset-emp and http://stackoverflow.com/questions/30536759/running-a-spark-application-in-intellij-14-1-3).  Below is the error:
> $ jupyter notebook
> [I 12:11:59.464 NotebookApp] Serving notebooks from local directory: /Users/tianhui
> [I 12:11:59.464 NotebookApp] 0 active kernels 
> [I 12:11:59.465 NotebookApp] The Jupyter Notebook is running at: http://localhost:8888/
> [I 12:11:59.465 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
> [I 12:12:06.847 NotebookApp] 302 GET / (::1) 0.47ms
> [I 12:12:10.591 NotebookApp] Creating new notebook in 
> [I 12:12:11.600 NotebookApp] Kernel started: 20ca2e71-781b-4208-ad88-bc04c1ca37d6
> Starting Spark Kernel with SPARK_HOME=/usr/local/Cellar/apache-spark/2.0.0/libexec/
> 16/09/03 12:12:12 [INFO] o.a.t.Main$$anon$1 - Kernel version: 0.1.0.dev9-incubating-SNAPSHOT
> 16/09/03 12:12:12 [INFO] o.a.t.Main$$anon$1 - Scala version: Some(2.10.4)
> 16/09/03 12:12:12 [INFO] o.a.t.Main$$anon$1 - ZeroMQ (JeroMQ) version: 3.2.2
> 16/09/03 12:12:12 [INFO] o.a.t.Main$$anon$1 - Initializing internal actor system
> Exception in thread "main" java.lang.NoSuchMethodError: scala.collection.immutable.HashSet$.empty()Lscala/collection/immutable/HashSet;
> 	at akka.actor.ActorCell$.<init>(ActorCell.scala:336)
> 	at akka.actor.ActorCell$.<clinit>(ActorCell.scala)
> 	at akka.actor.RootActorPath.$div(ActorPath.scala:185)
> 	at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:465)
> 	at akka.actor.LocalActorRefProvider.<init>(ActorRefProvider.scala:453)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> 	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> 	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> 	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
> 	at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$2.apply(DynamicAccess.scala:78)
> 	at scala.util.Try$.apply(Try.scala:192)
> 	at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:73)
> 	at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
> 	at akka.actor.ReflectiveDynamicAccess$$anonfun$createInstanceFor$3.apply(DynamicAccess.scala:84)
> 	at scala.util.Success.flatMap(Try.scala:231)
> 	at akka.actor.ReflectiveDynamicAccess.createInstanceFor(DynamicAccess.scala:84)
> 	at akka.actor.ActorSystemImpl.liftedTree1$1(ActorSystem.scala:585)
> 	at akka.actor.ActorSystemImpl.<init>(ActorSystem.scala:578)
> 	at akka.actor.ActorSystem$.apply(ActorSystem.scala:142)
> 	at akka.actor.ActorSystem$.apply(ActorSystem.scala:109)
> 	at org.apache.toree.boot.layer.StandardBareInitialization$class.createActorSystem(BareInitialization.scala:71)
> 	at org.apache.toree.Main$$anon$1.createActorSystem(Main.scala:35)
> 	at org.apache.toree.boot.layer.StandardBareInitialization$class.initializeBare(BareInitialization.scala:60)
> 	at org.apache.toree.Main$$anon$1.initializeBare(Main.scala:35)
> 	at org.apache.toree.boot.KernelBootstrap.initialize(KernelBootstrap.scala:70)
> 	at org.apache.toree.Main$delayedInit$body.apply(Main.scala:40)
> 	at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
> 	at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
> 	at scala.App$$anonfun$main$1.apply(App.scala:76)
> 	at scala.App$$anonfun$main$1.apply(App.scala:76)
> 	at scala.collection.immutable.List.foreach(List.scala:381)
> 	at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
> 	at scala.App$class.main(App.scala:76)
> 	at org.apache.toree.Main$.main(Main.scala:24)
> 	at org.apache.toree.Main.main(Main.scala)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> 	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> 	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> 	at java.lang.reflect.Method.invoke(Method.java:498)
> 	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:729)
> 	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
> 	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
> 	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
> 	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
> [W 12:12:21.630 NotebookApp] Timeout waiting for kernel_info reply from 20ca2e71-781b-4208-ad88-bc04c1ca37d6



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)