Posted to issues@spark.apache.org by "Shixiong Zhu (JIRA)" <ji...@apache.org> on 2016/06/01 19:49:59 UTC

[jira] [Commented] (SPARK-14146) Imported implicits can't be found in Spark REPL in some cases

    [ https://issues.apache.org/jira/browse/SPARK-14146?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15310980#comment-15310980 ] 

Shixiong Zhu commented on SPARK-14146:
--------------------------------------

We still need a fix for Scala 2.10. FYI, on Scala 2.10 the fallback mode works as a workaround:

{code}
scala> :fallback

Switched on fallback mode without restarting.
       If you have defined classes in the repl, it would
be good to redefine them incase you plan to use them. If you still run
into issues it would be good to restart the repl and turn on `:fallback`
mode as first command.
      

scala> class A; Seq(1 -> "a").toDS()
defined class A
res0: org.apache.spark.sql.Dataset[(Int, String)] = [_1: int, _2: string]

{code}
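
On Scala 2.11 the `:fallback` command is gone (as far as I can tell it only exists in our Scala 2.10 REPL fork), but entering the class definition and the Dataset call as separate REPL lines appears to avoid the problem. A sketch based on the repro above, not a real fix:

{code}
scala> class A
defined class A

scala> Seq(1 -> "a").toDS()  // works when entered on its own line
res0: org.apache.spark.sql.Dataset[(Int, String)] = [_1: int, _2: string]
{code}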

/cc [~prashant_] let's continue the discussion here.
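
For anyone trying to reproduce this outside Spark: our 2.x REPL starts the interpreter with class-based line wrappers, so the same failure should be reproducible in a stock Scala 2.11 REPL launched with `-Yrepl-class-based`. A sketch (untested here) reusing the snippet from the issue description:

{code}
// started via: scala -Yrepl-class-based
scala> class I(i: Int) { def double: Int = i * 2 }
scala> class Context { implicit def toI(i: Int): I = new I(i) }
scala> val c = new Context
scala> import c._
scala> 1.double            // OK on its own line
scala> class A; 1.double   // expected to fail, as in the Spark REPL
{code}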

> Imported implicits can't be found in Spark REPL in some cases
> -------------------------------------------------------------
>
>                 Key: SPARK-14146
>                 URL: https://issues.apache.org/jira/browse/SPARK-14146
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core, SQL
>    Affects Versions: 2.0.0
>            Reporter: Wenchen Fan
>
> {code}
> class I(i: Int) {
>   def double: Int = i * 2
> }
> class Context {
>   implicit def toI(i: Int): I = new I(i)
> }
> val c = new Context
> import c._
> // OK
> 1.double
> // Fail
> class A; 1.double
> {code}
> The above code snippet works in the plain Scala REPL, however.
> This affects our Dataset functionality, for example:
> {code}
> class A; Seq(1 -> "a").toDS() // fail
> {code}
> or in paste mode:
> {code}
> :paste
> class A
> Seq(1 -> "a").toDS() // fail
> {code}


