Posted to issues@spark.apache.org by "Dan Adkins (JIRA)" <ji...@apache.org> on 2015/12/09 02:24:10 UTC
[jira] [Created] (SPARK-12223) Spark 1.5 pre-built releases don't work with the Java version shipped with Macs
Dan Adkins created SPARK-12223:
----------------------------------
Summary: Spark 1.5 pre-built releases don't work with the Java version shipped with Macs
Key: SPARK-12223
URL: https://issues.apache.org/jira/browse/SPARK-12223
Project: Spark
Issue Type: Bug
Components: Spark Core
Affects Versions: 1.5.2, 1.5.1, 1.5.0
Environment: $ uname -a
Darwin grixis 14.5.0 Darwin Kernel Version 14.5.0: Tue Sep 1 21:23:09 PDT 2015; root:xnu-2782.50.1~1/RELEASE_X86_64 x86_64
$ java -version
java version "1.6.0_65"
Java(TM) SE Runtime Environment (build 1.6.0_65-b14-466.1-11M4716)
Java HotSpot(TM) 64-Bit Server VM (build 20.65-b04-466.1, mixed mode)
Reporter: Dan Adkins
Priority: Blocker
I downloaded the latest release (1.5.2) from [http://spark.apache.org/downloads.html] and attempted to execute step 1 of the Python quick start guide [http://spark.apache.org/docs/latest/quick-start.html].
$ ./bin/pyspark
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/launcher/Main
Caused by: java.lang.ClassNotFoundException: org.apache.spark.launcher.Main
at java.net.URLClassLoader$1.run(URLClassLoader.java:202)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:190)
at java.lang.ClassLoader.loadClass(ClassLoader.java:306)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:301)
at java.lang.ClassLoader.loadClass(ClassLoader.java:247)
This looks similar to SPARK-1703, which was caused by attempting to run a jar compiled for Java 7 under JRE 6. I reproduced the problem with all of the 1.5.x releases. The problem does not occur for me with version 1.4.1.
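One way to confirm this suspicion is to inspect the class-file major version of org/apache/spark/launcher/Main.class inside the launcher jar: classes compiled for Java 7 carry major version 51, while a Java 6 JRE can only load up to major version 50. The sketch below (a hypothetical helper, not part of Spark) parses the version from the first 8 bytes of a .class file; the sample header bytes are constructed inline for illustration.

```python
import struct

def class_file_major_version(header: bytes) -> int:
    """Return the class-file major version from the first 8 bytes
    of a .class file. Major 50 = Java 6, 51 = Java 7, 52 = Java 8."""
    # Layout: 4-byte magic (0xCAFEBABE), 2-byte minor, 2-byte major,
    # all big-endian.
    magic, minor, major = struct.unpack(">IHH", header[:8])
    if magic != 0xCAFEBABE:
        raise ValueError("not a Java class file")
    return major

# A header as it would appear in a class compiled for Java 7
# (major version 51 = 0x0033), which JRE 6 refuses to load.
java7_header = bytes.fromhex("cafebabe00000033")
print(class_file_major_version(java7_header))  # 51
```

In practice one would extract the class from the assembly jar (e.g. with `unzip -p <jar> org/apache/spark/launcher/Main.class | head -c 8`) and feed those bytes to the helper; a result of 51 would mean the pre-built release cannot run on the stock Apple Java 6.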
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org