Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2014/06/30 16:19:24 UTC
[jira] [Commented] (SPARK-2330) Spark shell has weird scala semantics
[ https://issues.apache.org/jira/browse/SPARK-2330?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14047689#comment-14047689 ]
Sean Owen commented on SPARK-2330:
----------------------------------
I can't reproduce this on HEAD right now. Can you try that?
This also sounds like a potential duplicate of https://issues.apache.org/jira/browse/SPARK-1199
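For reference, the usual workaround for this class of REPL problems (assuming this is the same per-line-wrapping issue as SPARK-1199, which is my guess, not something confirmed in this ticket) is to define the related types and the code that uses them in a single compilation unit, e.g. via :paste or inside one object. A minimal sketch of the first repro done that way (the object name Repro and the method name show are mine; show replaces the reporter's print, which also shadows Predef.print):

{noformat}
// Hypothetical workaround sketch: put the case class and the method
// that consumes it in one object, so the REPL compiles them in the
// same wrapper and the two `Foo` types are identical.
object Repro {
  case class Foo(x: Int)
  def show(f: Foo): Int = f.x  // `show` avoids shadowing Predef.print
}

// Usage: Repro.show(Repro.Foo(3)) compiles and returns the field value.
{noformat}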
> Spark shell has weird scala semantics
> -------------------------------------
>
> Key: SPARK-2330
> URL: https://issues.apache.org/jira/browse/SPARK-2330
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 0.9.1, 1.0.0
> Environment: Ubuntu 14.04 with spark-x.x.x-bin-hadoop2
> Reporter: Andrea Ferretti
> Labels: scala, shell
>
> Ordinary Scala expressions are interpreted in a strange way in the Spark shell. For instance:
> {noformat}
> case class Foo(x: Int)
> def print(f: Foo) = f.x
> val f = Foo(3)
> print(f)
> <console>:24: error: type mismatch;
> found : Foo
> required: Foo
> {noformat}
> As another example:
> {noformat}
> trait Currency
> case object EUR extends Currency
> case object USD extends Currency
> def nextCurrency: Currency = nextInt(2) match {
> case 0 => EUR
> case _ => USD
> }
> <console>:22: error: type mismatch;
> found : EUR.type
> required: Currency
> case 0 => EUR
> <console>:24: error: type mismatch;
> found : USD.type
> required: Currency
> case _ => USD
> {noformat}
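The second repro compiles cleanly when everything is in one compilation unit, which again points at the shell's wrapping rather than the code itself. A self-contained version for comparison (the object name CurrencyRepro and the sealed modifier are mine; I'm assuming scala.util.Random.nextInt was in scope in the reporter's session, since the reported errors are type mismatches rather than a missing nextInt):

{noformat}
// Self-contained version of the second repro, compiled as one unit.
import scala.util.Random.nextInt

object CurrencyRepro {
  sealed trait Currency
  case object EUR extends Currency
  case object USD extends Currency

  // Outside the REPL, the match is typed against the declared result
  // type Currency, so EUR.type and USD.type widen as expected.
  def nextCurrency: Currency = nextInt(2) match {
    case 0 => EUR
    case _ => USD
  }
}
{noformat}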
--
This message was sent by Atlassian JIRA
(v6.2#6252)