Posted to user@spark.apache.org by Shay Seng <sh...@1618labs.com> on 2013/10/12 01:55:20 UTC

Spark REPL produces an error on a piece of Scala code that works in the pure Scala REPL

Hey,
I'm seeing a funny situation where a piece of code executes in a pure Scala
REPL but not in the Spark shell.
I'm using Scala 2.9.3 with Spark 0.8.0.

In Spark I see:
class Animal() {
    def says():String = "???"
}

val amimal = new Animal
amimal: this.Animal = Animal@df27cd5

class Zoo[A <: Animal](thing: A) {
    def whoami()=thing.getClass
    def chat()=thing.says
}

val z = new Zoo[Animal](amimal)
<console>:16: error: type mismatch;
 found   : this.Animal
 required: this.Animal
       val z = new Zoo[Animal](amimal)
                                 ^

But if I run the exact same code in the plain Scala REPL:

val z = new Zoo[Animal](amimal)
z: Zoo[Animal] = Zoo@738ff53f


Both REPLs report Scala version 2.9.3:
Spark: "Using Scala version 2.9.3 (Java HotSpot(TM) 64-Bit Server VM, Java
1.7.0_40)"
Scala: "Welcome to Scala version 2.9.3 (Java HotSpot(TM) 64-Bit Server VM,
Java 1.7.0_40)."
Any ideas?

tks,
Shay
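
A workaround often suggested for this kind of REPL issue is to enter the related definitions as a single block (for example with :paste, or wrapped in one enclosing object), so they are compiled into the same shell wrapper and Animal refers to the same type everywhere. A minimal sketch under that assumption; the object name Menagerie is invented purely for illustration:

// Hypothetical wrapper object; entering it as one block (e.g. with :paste)
// keeps Animal and Zoo in the same REPL wrapper.
object Menagerie {
  class Animal {
    def says(): String = "???"
  }

  class Zoo[A <: Animal](thing: A) {
    def whoami() = thing.getClass
    def chat() = thing.says()
  }
}

import Menagerie._

val animal = new Animal
val z = new Zoo[Animal](animal)   // compiles: both sides refer to Menagerie.Animal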

Re: Spark REPL produces an error on a piece of Scala code that works in the pure Scala REPL

Posted by Matei Zaharia <ma...@gmail.com>.
We're still not using macros in the 2.10 branch, so this issue will still happen there. We may do macros later, but it's a fair bit of work, so I wouldn't guarantee that it happens in our first 2.10 release.

Matei

On Oct 12, 2013, at 2:33 PM, Mark Hamstra <ma...@clearstorydata.com> wrote:

> That's a TODO that is either now possible in the 2.10 branch or pretty close to possible -- which isn't the same thing as easy.
> 


Re: Spark REPL produces an error on a piece of Scala code that works in the pure Scala REPL

Posted by Mark Hamstra <ma...@clearstorydata.com>.
That's a TODO that is either now possible in the 2.10 branch or pretty
close to possible -- which isn't the same thing as easy.


On Sat, Oct 12, 2013 at 2:20 PM, Aaron Davidson <il...@gmail.com> wrote:

> Out of curiosity, does the Scala 2.10 Spark interpreter patch
> fix this using macros as Matei suggests in the linked discussion? Or is
> that still future work, but now possible?

Re: Spark REPL produces an error on a piece of Scala code that works in the pure Scala REPL

Posted by Aaron Davidson <il...@gmail.com>.
Out of curiosity, does the Scala 2.10 Spark interpreter patch
fix this using macros as Matei suggests in the linked discussion? Or is
that still future work, but now possible?


On Fri, Oct 11, 2013 at 6:04 PM, Reynold Xin <rx...@apache.org> wrote:

> This is a known problem and has to do with a peculiarity of the Scala shell:
>
>
> https://groups.google.com/forum/#!searchin/spark-users/error$3A$20type$20mismatch|sort:relevance/spark-users/bwAmbUgxWrA/HwP4Nv4adfEJ

Re: Spark REPL produces an error on a piece of Scala code that works in the pure Scala REPL

Posted by Reynold Xin <rx...@apache.org>.
This is a known problem and has to do with a peculiarity of the Scala shell:

https://groups.google.com/forum/#!searchin/spark-users/error$3A$20type$20mismatch|sort:relevance/spark-users/bwAmbUgxWrA/HwP4Nv4adfEJ
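
The gist, as a simplified sketch (an assumption about the shape of the generated wrappers, not the literal code either shell emits): the plain Scala 2.9 REPL wraps each line in an object, while the Spark shell wraps lines in classes so that interpreter state can be serialized and shipped to executors, and a class nested in a class becomes a path-dependent type. The names Line1 and Line2 below are made up for illustration:

// Hypothetical shape of the plain Scala 2.9 REPL wrappers: each line becomes
// an object, so a nested class has a stable path and `Animal` denotes the
// same type on every later line.
object Line1 { class Animal }
object Line2 { class Zoo[A <: Line1.Animal](thing: A) }
val ok = new Line2.Zoo[Line1.Animal](new Line1.Animal)

// The Spark shell instead wraps lines in classes (so the interpreter state
// can be serialized to executors); a class nested in a class is tied to the
// enclosing instance, which is where the mismatch comes from.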


On Fri, Oct 11, 2013 at 6:01 PM, Aaron Davidson <il...@gmail.com> wrote:

> Playing around with this a little more, it seems that classOf[Animal] is
> "this.Animal" in Spark and "Animal" in normal Scala.
>
> Also, trying to do something like this:
> class Zoo[A <: this.Animal](thing: A) { }
>
> works in Scala but throws a weird error in Spark:
> "error: type Animal is not a member of this.$iwC"

Re: Spark REPL produces an error on a piece of Scala code that works in the pure Scala REPL

Posted by Aaron Davidson <il...@gmail.com>.
Playing around with this a little more, it seems that classOf[Animal] is
"this.Animal" in Spark and "Animal" in normal Scala.

Also, trying to do something like this:
class Zoo[A <: this.Animal](thing: A) { }

works in Scala but throws a weird error in Spark:
"error: type Animal is not a member of this.$iwC"

