Posted to issues@spark.apache.org by "Thomas Graves (JIRA)" <ji...@apache.org> on 2015/06/24 15:47:04 UTC

[jira] [Assigned] (SPARK-8574) org/apache/spark/unsafe doesn't honor the java source/target versions

     [ https://issues.apache.org/jira/browse/SPARK-8574?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Thomas Graves reassigned SPARK-8574:
------------------------------------

    Assignee: Thomas Graves

> org/apache/spark/unsafe doesn't honor the java source/target versions
> ---------------------------------------------------------------------
>
>                 Key: SPARK-8574
>                 URL: https://issues.apache.org/jira/browse/SPARK-8574
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 1.4.0
>            Reporter: Thomas Graves
>            Assignee: Thomas Graves
>
> I built Spark using JDK 8. The default source compatibility in the top-level pom is 1.6, so I expected to be able to run Spark with JDK 7, but it fails because the unsafe module doesn't seem to honor the source/target compatibility options set in the top-level pom (a quick class-version check is sketched after the stack trace).
> Exception in thread "main" java.lang.UnsupportedClassVersionError: org/apache/spark/unsafe/memory/MemoryAllocator : Unsupported major.minor version 52.0
>         at java.lang.ClassLoader.defineClass1(Native Method)
>         at java.lang.ClassLoader.defineClass(ClassLoader.java:791)
>         at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
>         at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
>         at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
>         at java.security.AccessController.doPrivileged(Native Method)
>         at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
>         at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
>         at org.apache.spark.SparkEnv$.create(SparkEnv.scala:392)
>         at org.apache.spark.SparkEnv$.createExecutorEnv(SparkEnv.scala:211)
>         at org.apache.spark.executor.CoarseGrainedExecutorBackend$$anonfun$run$1.apply$mcV$sp(CoarseGrainedExecutorBackend.scala:180)
>         at org.apache.spark.deploy.SparkHadoopUtil.runAsSparkUser(SparkHadoopUtil.scala:74)
>         at org.apache.spark.executor.CoarseGrainedExecutorBackend$.run(CoarseGrainedExecutorBackend.scala:146)
>         at org.apache.spark.executor.CoarseGrainedExecutorBackend$.main(CoarseGrainedExecutorBackend.scala:245)
>         at org.apache.spark.executor.CoarseGrainedExecutorBackend.main(CoarseGrainedExecutorBackend.scala)
> 15/06/23 19:48:24 INFO storage.DiskBlockManager: Shutdown hook called
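The "Unsupported major.minor version 52.0" in the trace means the class was compiled for Java 8, while a JDK 7 runtime can only load class files up to version 51.0. One quick way to confirm which level a given class was actually compiled for is to read the class-file header directly; a minimal sketch in Java, where the .class path passed on the command line is only an example:

    import java.io.DataInputStream;
    import java.io.FileInputStream;
    import java.io.IOException;

    /**
     * Prints the class-file major.minor version of a compiled .class file.
     * Major 50 = Java 6, 51 = Java 7, 52 = Java 8, so a value of 52 here
     * means the class will not load on a JDK 7 runtime.
     */
    public class ClassVersionCheck {
        public static void main(String[] args) throws IOException {
            try (DataInputStream in = new DataInputStream(new FileInputStream(args[0]))) {
                int magic = in.readInt();               // class files start with 0xCAFEBABE
                int minor = in.readUnsignedShort();     // minor_version
                int major = in.readUnsignedShort();     // major_version
                if (magic != 0xCAFEBABE) {
                    System.err.println(args[0] + " is not a class file");
                    return;
                }
                System.out.println(args[0] + ": major.minor " + major + "." + minor);
            }
        }
    }

For example, after extracting org/apache/spark/unsafe/memory/MemoryAllocator.class from whichever jar ships it (with "jar xf <jar>"), running "java ClassVersionCheck org/apache/spark/unsafe/memory/MemoryAllocator.class" would report 52.0 if the unsafe module ignored the 1.6 source/target settings.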



