Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2020/03/06 02:23:00 UTC
[jira] [Resolved] (SPARK-31043) Spark 3.0 built against hadoop2.7 can't start standalone master
[ https://issues.apache.org/jira/browse/SPARK-31043?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Hyukjin Kwon resolved SPARK-31043.
----------------------------------
Fix Version/s: 3.0.0
Resolution: Fixed
> Spark 3.0 built against hadoop2.7 can't start standalone master
> ---------------------------------------------------------------
>
> Key: SPARK-31043
> URL: https://issues.apache.org/jira/browse/SPARK-31043
> Project: Spark
> Issue Type: Bug
> Components: Build
> Affects Versions: 3.0.0
> Reporter: Thomas Graves
> Priority: Critical
> Fix For: 3.0.0
>
>
> Trying to start a standalone master, when Spark branch 3.0 is built against hadoop2.7, fails with:
>
> Exception in thread "main" java.lang.NoClassDefFoundError: org/w3c/dom/ElementTraversal
> at java.lang.ClassLoader.defineClass1(Native Method)
> at java.lang.ClassLoader.defineClass(ClassLoader.java:757)
> at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
> at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
> at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
> ...
> Caused by: java.lang.ClassNotFoundException: org.w3c.dom.ElementTraversal
> at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:419)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:352)
> ... 42 more
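> As a diagnostic sketch (not part of the original report): the `Caused by: ClassNotFoundException` means no jar on the classpath provides `org.w3c.dom.ElementTraversal`, a class commonly shipped in the `xml-apis` jar rather than in the JDK on Java 11+. Assuming a standard distribution layout where jars live under `$SPARK_HOME/jars`, one way to confirm which jar (if any) supplies the class is to scan that directory:
>
> ```shell
> # Sketch: scan every jar under $SPARK_HOME/jars for the missing class.
> # Assumes $SPARK_HOME points at the built distribution and that the
> # standard `unzip` tool is installed; both are assumptions, not part
> # of the original report.
> for jar in "$SPARK_HOME"/jars/*.jar; do
>   # unzip -l lists the jar's entries; grep for the class file path
>   if unzip -l "$jar" 2>/dev/null | grep -q 'org/w3c/dom/ElementTraversal.class'; then
>     echo "found in: $jar"
>   fi
> done
> ```
>
> If the loop prints nothing, the class is absent from the distribution, which matches the `NoClassDefFoundError` above.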
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org