Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2016/01/08 13:08:39 UTC

[jira] [Resolved] (SPARK-1927) Implicits declared in companion objects not found in Spark shell

     [ https://issues.apache.org/jira/browse/SPARK-1927?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen resolved SPARK-1927.
------------------------------
    Resolution: Duplicate

> Implicits declared in companion objects not found in Spark shell
> ----------------------------------------------------------------
>
>                 Key: SPARK-1927
>                 URL: https://issues.apache.org/jira/browse/SPARK-1927
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Shell
>    Affects Versions: 0.9.0
>         Environment: Ubuntu Linux 14.04, Oracle Java 7u55
>            Reporter: Piotr Kołaczkowski
>
> {code}
> scala> :paste
> // Entering paste mode (ctrl-D to finish)
> trait Mapper[T]
> class Foo
> object Foo { implicit object FooMapper extends Mapper[Foo] }
> // Exiting paste mode, now interpreting.
> defined trait Mapper
> defined class Foo
> defined module Foo
> scala> implicitly[Mapper[Foo]]
> <console>:28: error: could not find implicit value for parameter e: Mapper[Foo]
>               implicitly[Mapper[Foo]]
>                         ^
> {code}
> The exact same example works in the official Scala REPL (2.10.4):
> {code}
> scala> :paste
> // Entering paste mode (ctrl-D to finish)
> trait Mapper[T]
> class Foo
> object Foo { implicit object FooMapper extends Mapper[Foo] }
> // Exiting paste mode, now interpreting.
> defined trait Mapper
> defined class Foo
> defined module Foo
> scala> implicitly[Mapper[Foo]]
> res0: Mapper[Foo] = Foo$FooMapper$@4a20e9c6
> {code}
> I guess it might be another manifestation of the problem of everything being an inner object in the Spark REPL: class Foo and object Foo presumably end up in different wrapper objects, so object Foo is no longer a true companion of class Foo, and its implicits drop out of the implicit scope of Mapper[Foo].
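> For illustration, here is a minimal plain-Scala sketch of how such wrapping could break companion-object implicit lookup. The wrapper names are hypothetical, not the actual code the REPL generates:
> {code}
> trait Mapper[T]
>
> // What the user writes: class and companion in the same scope.
> object SameScope {
>   class Foo
>   object Foo { implicit object FooMapper extends Mapper[Foo] }
>
>   // Compiles: FooMapper is found because object Foo is a genuine
>   // companion of class Foo, so it sits in the implicit scope of Mapper[Foo].
>   val ok: Mapper[Foo] = implicitly[Mapper[Foo]]
> }
>
> // Roughly what REPL wrapping might produce: the class and the "companion"
> // end up inside different wrapper objects.
> object Wrapper1 {
>   class Foo
> }
> object Wrapper2 {
>   object Foo { implicit object FooMapper extends Mapper[Wrapper1.Foo] }
> }
>
> object BrokenScope {
>   import Wrapper1.Foo
>
>   // Does not compile if uncommented: Wrapper2.Foo is not a companion of
>   // Wrapper1.Foo, so FooMapper is not in implicit scope.
>   // val ko: Mapper[Foo] = implicitly[Mapper[Foo]]
>
>   // Workaround: bring the implicit into lexical scope explicitly.
>   import Wrapper2.Foo._
>   val fixed: Mapper[Foo] = implicitly[Mapper[Foo]]
> }
> {code}
> If that is indeed the cause, an explicit import Foo._ in the Spark shell should work around it, since it puts the implicit into lexical scope rather than relying on companion-object lookup.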



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org