Posted to dev@toree.apache.org by "Jakob Odersky (JIRA)" <ji...@apache.org> on 2017/02/07 00:16:41 UTC

[jira] [Comment Edited] (TOREE-375) Incorrect fully qualified name for spark context

    [ https://issues.apache.org/jira/browse/TOREE-375?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15855019#comment-15855019 ] 

Jakob Odersky edited comment on TOREE-375 at 2/7/17 12:16 AM:
--------------------------------------------------------------

-Yrepl-class-based strikes again.

I managed to track this down to the [refreshDefinitions()|https://github.com/apache/incubator-toree/blob/master/scala-interpreter/src/main/scala-2.11/org/apache/toree/kernel/interpreter/scala/ScalaInterpreterSpecific.scala#L79-L97] function that is called when jars are added dynamically. It appears that the `valueOfTerm` method does not find a value associated with any of the variables. The following snippet illustrates that this behaviour only occurs when the `-Yrepl-class-based` option is set in the REPL (the option is required for Spark to correctly serialize objects).

{code}
import scala.tools.nsc.Settings
import scala.tools.nsc.interpreter._

object Main extends App {
  val settings = new Settings
  settings.usejavacp.value = true
  // Uncomment the next line to enable -Yrepl-class-based and reproduce the failure:
  //settings.Yreplclassbased.value = true

  val iMain: IMain = new IMain(settings)
  iMain.initializeSynchronous()

  iMain.interpret("val x = 1")

  iMain.definedTerms.foreach { name =>
    println("defined term: " + name.toString)
    iMain.valueOfTerm(name.toString) match {
      case Some(value) => println("value: " + value)
      case None => println("no value")
    }
  }

}
{code}

The above code yields:
{code}
[info] Running foo.Main 
[info] x: Int = 1
[info] defined term: x
[info] value: 1
{code}

With -Yrepl-class-based enabled (by uncommenting the line above), it yields:
{code}
[info] Running foo.Main 
[info] x: Int = 1
[info] defined term: x
[info] no value
{code}
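For context (this is my reading of the option, not verified against the compiler sources): -Yrepl-class-based makes the REPL wrap each interpreted line in a class instead of an object, so values become instance members rather than members of a singleton that is reachable by name alone. The contrast can be sketched with plain reflection, outside the REPL; `ObjectWrapper` and `ClassWrapper` below are illustrative stand-ins, not actual REPL internals:

{code}
// Sketch: object-based vs class-based wrapping of a definition like `val x = 1`.
// ObjectWrapper/ClassWrapper are hypothetical stand-ins for REPL line wrappers.

// Object wrapper: the singleton lives in a static MODULE$ field of the
// compiled class, so a purely name-based lookup can recover the instance
// and read x from it.
object ObjectWrapper { val x = 1 }

// Class wrapper: x only exists on instances; without a handle on the
// particular instance the REPL created, there is nothing to read x from.
class ClassWrapper { val x = 1 }

object Demo extends App {
  val objClass = ObjectWrapper.getClass // the compiled ObjectWrapper$ class
  val singleton = objClass.getField("MODULE$").get(null)
  println("via static lookup: " + objClass.getMethod("x").invoke(singleton))

  // The class wrapper exposes no MODULE$ (no public static entry point):
  val hasStatic = classOf[ClassWrapper].getFields.exists(_.getName == "MODULE$")
  println("class wrapper has static entry point: " + hasStatic)
}
{code}

If `valueOfTerm` assumes the object-style layout when resolving a name, that would explain why it comes back empty in the class-based case.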



> Incorrect fully qualified name for spark context
> ------------------------------------------------
>
>                 Key: TOREE-375
>                 URL: https://issues.apache.org/jira/browse/TOREE-375
>             Project: TOREE
>          Issue Type: Bug
>         Environment: Jupyter Notebook with Toree latest master (1a9c11f5f1381c15b691a716acd0e1f0432a9a35) and Spark 2.0.2, Scala 2.11
>            Reporter: Felix Schüler
>
> When running the snippet below in a cell, I get a compile error for the MLContext constructor. Somehow the fully qualified name of the SparkContext gets mangled.
> The same does not happen when I start a Spark shell with the --jars command and create the MLContext there.
> Snippet (the systemml jar is built from the latest master of SystemML):
> {code}
> %addjar file:///home/felix/repos/incubator-systemml/target/systemml-0.13.0-incubating-SNAPSHOT.jar -f
> import org.apache.sysml.api.mlcontext._
> import org.apache.sysml.api.mlcontext.ScriptFactory._
> val ml = new MLContext(sc)
> Starting download from file:///home/felix/repos/incubator-systemml/target/systemml-0.13.0-incubating-SNAPSHOT.jar
> Finished download of systemml-0.13.0-incubating-SNAPSHOT.jar
> Name: Compile Error
> Message: <console>:25: error: overloaded method constructor MLContext with alternatives:
>   (x$1: org.apache.spark.api.java.JavaSparkContext)org.apache.sysml.api.mlcontext.MLContext <and>
>   (x$1: org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.SparkContext)org.apache.sysml.api.mlcontext.MLContext
>  cannot be applied to (org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.org.apache.spark.SparkContext)
>        val ml = new MLContext(sc)
>                 ^
> StackTrace: 
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)