Posted to issues@spark.apache.org by "Malcolm Greaves (JIRA)" <ji...@apache.org> on 2015/08/14 22:41:45 UTC

[jira] [Commented] (SPARK-6152) Spark does not support Java 8 compiled Scala classes

    [ https://issues.apache.org/jira/browse/SPARK-6152?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14697716#comment-14697716 ] 

Malcolm Greaves commented on SPARK-6152:
----------------------------------------

Interesting [~stevel@apache.org]! What kinds of changes do you think this would require -- mostly verifying that there's backward compatibility with those serialized classes?

> Spark does not support Java 8 compiled Scala classes
> ----------------------------------------------------
>
>                 Key: SPARK-6152
>                 URL: https://issues.apache.org/jira/browse/SPARK-6152
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.2.1
>         Environment: Java 8+
> Scala 2.11
>            Reporter: Ronald Chen
>            Priority: Minor
>
> Spark uses reflectasm to inspect user-defined Scala closures, which fails if those *user defined Scala closures* are compiled to the Java 8 class file version
> The cause is that reflectasm does not support Java 8:
> https://github.com/EsotericSoftware/reflectasm/issues/35
> Workaround:
> Don't compile Scala classes to the Java 8 bytecode target; Scala 2.11 neither requires nor depends on any Java 8 features
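> A minimal sketch of the workaround for an sbt build (the sbt setup is an assumption; {{-target:jvm-1.7}} is the standard scalac 2.11 option for emitting Java 7 class files):
> {code}
> // build.sbt -- emit Java 7 (class file version 51) bytecode so that
> // reflectasm's shaded ASM parser can read the compiled closures
> scalacOptions += "-target:jvm-1.7"
> // keep any Java sources on the same class file version
> javacOptions ++= Seq("-source", "1.7", "-target", "1.7")
> {code}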
> Stack trace:
> {code}
> java.lang.IllegalArgumentException
> 	at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
> 	at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
> 	at com.esotericsoftware.reflectasm.shaded.org.objectweb.asm.ClassReader.<init>(Unknown Source)
> 	at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$getClassReader(ClosureCleaner.scala:41)
> 	at org.apache.spark.util.ClosureCleaner$.getInnerClasses(ClosureCleaner.scala:84)
> 	at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:107)
> 	at org.apache.spark.SparkContext.clean(SparkContext.scala:1478)
> 	at org.apache.spark.rdd.RDD.map(RDD.scala:288)
> 	at ...my Scala 2.11 compiled to Java 8 code calling into spark
> {code}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org