Posted to issues@spark.apache.org by "Andrea Ferretti (JIRA)" <ji...@apache.org> on 2014/06/30 16:11:25 UTC
[jira] [Created] (SPARK-2330) Spark shell has weird scala semantics
Andrea Ferretti created SPARK-2330:
--------------------------------------
Summary: Spark shell has weird scala semantics
Key: SPARK-2330
URL: https://issues.apache.org/jira/browse/SPARK-2330
Project: Spark
Issue Type: Bug
Components: Spark Core
Affects Versions: 1.0.0, 0.9.1
Environment: Ubuntu 14.04 with spark-x.x.x-bin-hadoop2
Reporter: Andrea Ferretti
Normal Scala expressions are interpreted in a strange way in the Spark shell. For instance:
case class Foo(x: Int)
def print(f: Foo) = f.x
val f = Foo(3)
print(f)
<console>:24: error: type mismatch;
found : Foo
required: Foo
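
For context, this "found: Foo, required: Foo" mismatch looks like the usual path-dependent type behaviour you get when the same definition ends up wrapped in two different outer instances. A minimal sketch of that effect in a plain Scala REPL follows; the Wrapper class is a hypothetical stand-in for illustration, not the code the Spark shell actually generates:

// Hypothetical sketch: a nested case class becomes a path-dependent type,
// so Foo seen through two different outer instances is two distinct types
// even though both print simply as "Foo".
class Wrapper {
  case class Foo(x: Int)
}

val w1 = new Wrapper
val w2 = new Wrapper

def show(f: w1.Foo): Int = f.x

show(new w1.Foo(3))      // compiles: the paths match
// show(new w2.Foo(3))   // error: type mismatch; found: w2.Foo, required: w1.Foo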
Here is another example:
trait Currency
case object EUR extends Currency
case object USD extends Currency
import scala.util.Random.nextInt

def nextCurrency: Currency = nextInt(2) match {
case 0 => EUR
case _ => USD
}
<console>:22: error: type mismatch;
found : EUR.type
required: Currency
case 0 => EUR
<console>:24: error: type mismatch;
found : USD.type
required: Currency
case _ => USD
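
A workaround sometimes suggested for REPL wrapping issues (an assumption on my part, not something verified against this report) is to compile the whole hierarchy as a single unit, e.g. by pasting it in one :paste block or by putting it inside one object, so that every definition lands in the same wrapper:

// Assumed workaround sketch: keep the trait, the case objects, and the
// method that matches on them in one compilation unit.
import scala.util.Random.nextInt

object Currencies {
  trait Currency
  case object EUR extends Currency
  case object USD extends Currency

  def nextCurrency: Currency = nextInt(2) match {
    case 0 => EUR
    case _ => USD
  }
}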