Posted to issues@spark.apache.org by "Igor Tkachenko (JIRA)" <ji...@apache.org> on 2014/10/01 18:07:37 UTC

[jira] [Updated] (SPARK-3761) Class anonfun$1 not found exception / sbt 13.5 / Scala 2.10.4

     [ https://issues.apache.org/jira/browse/SPARK-3761?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Igor Tkachenko updated SPARK-3761:
----------------------------------
    Summary: Class anonfun$1 not found exception / sbt 13.5 / Scala 2.10.4  (was: Class not found exception / sbt 13.5 / Scala 2.10.4)

> Class anonfun$1 not found exception / sbt 13.5 / Scala 2.10.4
> -------------------------------------------------------------
>
>                 Key: SPARK-3761
>                 URL: https://issues.apache.org/jira/browse/SPARK-3761
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 1.0.0
>            Reporter: Igor Tkachenko
>
> I have this Scala code:
>
> import org.apache.spark.{SparkConf, SparkContext}
>
> val master = "spark://<server address>:7077"
> val sc = new SparkContext(new SparkConf()
>   .setMaster(master)
>   .setAppName("SparkQueryDemo 01")
>   .set("spark.executor.memory", "512m"))
>
> // Count the lines of the HDFS file that contain "Word".
> val count2 = sc
>   .textFile("hdfs://<server address>:8020/tmp/data/risk/account.txt")
>   .filter(line => line.contains("Word"))
>   .count()
> I get the following error:
>
> [error] (run-main-0) org.apache.spark.SparkException: Job aborted due to stage failure:
> Task 0.0:0 failed 4 times, most recent failure: Exception failure in TID 6 on host
> <server address>: java.lang.ClassNotFoundException: SimpleApp$$anonfun$1
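>
> SimpleApp$$anonfun$1 is the compiled class of the filter closure above, so the executors are failing to load the application's own classes. A minimal sketch of the usual workaround, assuming the project is packaged with "sbt package" and that the jar path below is a hypothetical stand-in for the real one:
>
> import org.apache.spark.{SparkConf, SparkContext}
>
> val conf = new SparkConf()
>   .setMaster(master)
>   .setAppName("SparkQueryDemo 01")
>   .set("spark.executor.memory", "512m")
>   // Ship the application jar to the executors so that closure classes
>   // such as SimpleApp$$anonfun$1 can be loaded there.
>   // (The jar path is a hypothetical example.)
>   .setJars(Seq("target/scala-2.10/simpleapp_2.10-1.0.jar"))
> val sc = new SparkContext(conf)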
> My dependencies:
>
> object Version {
>   val spark = "1.0.0-cdh5.1.0"
> }
>
> object Library {
>   val sparkCore = "org.apache.spark" % "spark-assembly_2.10" % Version.spark
> }
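>
> These objects follow the common sbt pattern of centralizing versions in the meta-build; a minimal sketch of how they would be wired in, assuming they sit in project/Dependencies.scala (a hypothetical file name):
>
> // project/Dependencies.scala
> import sbt._
>
> object Version { val spark = "1.0.0-cdh5.1.0" }
> object Library { val sparkCore = "org.apache.spark" % "spark-assembly_2.10" % Version.spark }
>
> // build.sbt (definitions under project/ are in scope here)
> libraryDependencies += Library.sparkCore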
> My OS is Windows 7.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org