Posted to user@spark.apache.org by "Kelly, Jonathan" <jo...@amazon.com> on 2015/03/30 22:03:29 UTC

Spark and OpenJDK - jar: No such file or directory

I'm trying to use OpenJDK 7 with Spark 1.3.0 and noticed that the compute-classpath.sh script is not adding the datanucleus jars to the classpath, because compute-classpath.sh assumes the jar command is located at $JAVA_HOME/bin/jar, which does not exist for OpenJDK.  Has anybody else run into this issue?  Would it be possible to use the unzip command instead?
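
Something along these lines is roughly what I have in mind, just an untested sketch (the list_jar function name is made up, not from compute-classpath.sh):

    # Pick a tool for listing jar contents; fall back to unzip when
    # $JAVA_HOME/bin/jar is missing (e.g. a JRE-only OpenJDK install).
    if [ -x "$JAVA_HOME/bin/jar" ]; then
      list_jar() { "$JAVA_HOME/bin/jar" tf "$1"; }
    elif command -v unzip >/dev/null 2>&1; then
      list_jar() { unzip -Z1 "$1"; }   # zipinfo mode: one entry name per line
    else
      echo "Neither jar nor unzip is available on this machine" >&2
    fi
    # usage: list_jar /path/to/datanucleus-core.jar | head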

The fact that $JAVA_HOME/bin/jar is missing also breaks the check that ensures Spark was built with a version of Java compatible with the one being used to launch Spark.  The unzip tool of course wouldn't work for this, but there's probably another easy alternative to $JAVA_HOME/bin/jar.
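
Maybe something like this could serve as that easy alternative: listing the jar with unzip doesn't tell you anything about the Java version, but piping a class file out with unzip -p and reading the class-file major version from its header bytes might.  A rough, untested sketch (the jar path and class name are just placeholders):

    # Bytes 6-7 of a .class file hold the major version:
    # 50 = Java 6, 51 = Java 7, 52 = Java 8.
    ASSEMBLY_JAR=/path/to/spark-assembly.jar
    CLASS=org/apache/spark/SparkContext.class
    major=$(unzip -p "$ASSEMBLY_JAR" "$CLASS" | od -An -tu1 -j7 -N1 | tr -d ' ')
    echo "assembly built for class-file version $major"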

~ Jonathan Kelly

Re: Spark and OpenJDK - jar: No such file or directory

Posted by "Kelly, Jonathan" <jo...@amazon.com>.
Ah, never mind, I found the jar command in the java-1.7.0-openjdk-devel package.  I only had java-1.7.0-openjdk installed.  Looks like I just need to install java-1.7.0-openjdk-devel and then set JAVA_HOME to /usr/lib/jvm/java instead of /usr/lib/jvm/jre.
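
For anyone else who hits this, here's roughly what that looks like on my end (package manager command assumed; adjust for your distro):

    # The JRE-only package has no bin/jar; the -devel package does.
    sudo yum install -y java-1.7.0-openjdk-devel
    # Point JAVA_HOME at the JDK layout rather than the JRE.
    export JAVA_HOME=/usr/lib/jvm/java
    ls "$JAVA_HOME/bin/jar"   # should exist now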

~ Jonathan Kelly

From: "Kelly, Jonathan" <jo...@amazon.com>
Date: Monday, March 30, 2015 at 1:03 PM
To: "user@spark.apache.org" <us...@spark.apache.org>
Subject: Spark and OpenJDK - jar: No such file or directory
