Posted to issues@spark.apache.org by "Sean Owen (JIRA)" <ji...@apache.org> on 2014/10/22 08:04:33 UTC

[jira] [Commented] (SPARK-4044) Thriftserver fails to start when JAVA_HOME points to JRE instead of JDK

    [ https://issues.apache.org/jira/browse/SPARK-4044?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14179620#comment-14179620 ] 

Sean Owen commented on SPARK-4044:
----------------------------------

How about using {{unzip -l}} to probe the contents of the .jar files? They're just zip files, after all. You can check the exit status to see whether the archive contained the entry in question -- 0 if it did, non-zero otherwise.

I am not sure how this will interact with the invalid-JAR-file check that is also in the script, though.
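
For illustration, a minimal sketch of such a probe, assuming a hypothetical jar path and entry name (this is not the actual compute-classpath.sh logic):

{code}
# Sketch only: probe a jar for a specific entry with unzip, which does not
# require a JDK. The jar path and class entry below are placeholders.
ASSEMBLY_JAR="/path/to/spark-assembly.jar"
ENTRY="org/datanucleus/api/jdo/JDOPersistenceManagerFactory.class"

# unzip -l exits 0 if the named entry exists in the archive and non-zero
# (11, "no matching files were found") if it does not.
if unzip -l "$ASSEMBLY_JAR" "$ENTRY" > /dev/null 2>&1; then
  echo "found $ENTRY in $ASSEMBLY_JAR"
else
  echo "$ENTRY not present in $ASSEMBLY_JAR"
fi
{code}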

> Thriftserver fails to start when JAVA_HOME points to JRE instead of JDK
> -----------------------------------------------------------------------
>
>                 Key: SPARK-4044
>                 URL: https://issues.apache.org/jira/browse/SPARK-4044
>             Project: Spark
>          Issue Type: Bug
>          Components: Documentation, SQL
>    Affects Versions: 1.1.0, 1.2.0
>            Reporter: Josh Rosen
>
> If {{JAVA_HOME}} points to a JRE instead of a JDK, e.g.
> {code}
> JAVA_HOME=/usr/lib/jvm/java-7-oracle/jre/ 
> {code}
> instead of 
> {code}
> JAVA_HOME=/usr/lib/jvm/java-7-oracle/
> {code}
> then {{start-thriftserver.sh}} will fail with Datanucleus JAR errors:
> {code}
> Caused by: java.lang.ClassNotFoundException: org.datanucleus.api.jdo.JDOPersistenceManagerFactory
> 	at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
> 	at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
> 	at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
> 	at java.lang.Class.forName0(Native Method)
> 	at java.lang.Class.forName(Class.java:270)
> 	at javax.jdo.JDOHelper$18.run(JDOHelper.java:2018)
> 	at javax.jdo.JDOHelper$18.run(JDOHelper.java:2016)
> 	at java.security.AccessController.doPrivileged(Native Method)
> 	at javax.jdo.JDOHelper.forName(JDOHelper.java:2015)
> 	at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1162)
> {code}
> The root problem seems to be that {{compute-classpath.sh}} uses {{JAVA_HOME}} to find the path to the {{jar}} command, which isn't present in JRE directories.  This leads to silent failures when adding the Datanucleus JARs to the classpath.
> This same issue presumably affects the check for whether Spark was built with Java 7 but is being run on Java 6.
> We should probably add error handling that checks whether the {{jar}} command is actually present and fails with a clear message otherwise, and also update the documentation to say that {{JAVA_HOME}} must point to a JDK and not a JRE.
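
For illustration, a minimal sketch of the kind of error handling the description asks for (variable names are placeholders, not the actual compute-classpath.sh code):

{code}
# Sketch only: fail fast if JAVA_HOME does not provide the jar tool,
# which ships with a JDK but not with a JRE.
JAR_CMD="$JAVA_HOME/bin/jar"
if [ ! -x "$JAR_CMD" ]; then
  echo "Error: '$JAR_CMD' not found; JAVA_HOME must point to a JDK, not a JRE." 1>&2
  exit 1
fi
{code}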



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
