Posted to issues@spark.apache.org by "Wang Yanlin (JIRA)" <ji...@apache.org> on 2019/07/11 03:05:00 UTC

[jira] [Commented] (SPARK-28337) spark jars do not contain commons-jxpath jar, cause ClassNotFound exception

    [ https://issues.apache.org/jira/browse/SPARK-28337?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16882619#comment-16882619 ] 

Wang Yanlin commented on SPARK-28337:
-------------------------------------

My POM configuration for shading commons-configuration and commons-jxpath:

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <parent>
    <artifactId>yanzhi-app</artifactId>
    <groupId>com.alipay.yanzhi</groupId>
    <version>1.0-SNAPSHOT</version>
  </parent>
  <modelVersion>4.0.0</modelVersion>

  <artifactId>spark-app</artifactId>
  <packaging>jar</packaging>

  <properties>
    <yanzhi.shade.packageName>com.yanzhi_project</yanzhi.shade.packageName>
  </properties>

  <dependencies>
    <dependency>
      <groupId>commons-jxpath</groupId>
      <artifactId>commons-jxpath</artifactId>
      <version>1.3</version>
    </dependency>
    <dependency>
      <groupId>commons-configuration</groupId>
      <artifactId>commons-configuration</artifactId>
      <version>1.5</version>
      <scope>compile</scope>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <configuration>
          <shadedArtifactAttached>false</shadedArtifactAttached>
          <artifactSet>
            <includes>
              <include>commons-jxpath:commons-jxpath</include>
              <include>commons-configuration:commons-configuration</include>
            </includes>
          </artifactSet>
          <filters>
            <filter>
              <artifact>*:*</artifact>
              <excludes>
                <exclude>META-INF/*.SF</exclude>
                <exclude>META-INF/*.DSA</exclude>
                <exclude>META-INF/*.RSA</exclude>
              </excludes>
            </filter>
          </filters>
          <!--
          <relocations>
            <relocation>
              <pattern>org.apache.commons</pattern>
              <shadedPattern>${yanzhi.shade.packageName}.commons</shadedPattern>
            </relocation>
          </relocations>
          -->
        </configuration>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>

</project>
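For reference, a minimal Scala sketch (not part of the original report) of how XPathExpressionEngine is typically wired into commons-configuration 1.x. The config file name "settings.xml" and the XPath key are made up for illustration; the point is that creating the engine loads commons-jxpath classes at runtime, so they must be on the Spark classpath, here by being shaded into the application jar by the POM above.

import org.apache.commons.configuration.XMLConfiguration
import org.apache.commons.configuration.tree.xpath.XPathExpressionEngine

object XPathConfigSketch {
  def main(args: Array[String]): Unit = {
    // Instantiating the engine is what pulls in org.apache.commons.jxpath classes,
    // which is where the ClassNotFoundException comes from when the
    // commons-jxpath jar is missing from the classpath.
    val engine = new XPathExpressionEngine()

    // Hypothetical config file and key, only to show the XPath-style lookup.
    val config = new XMLConfiguration("settings.xml")
    config.setExpressionEngine(engine)
    println(config.getString("database/connection/url"))
  }
}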

> spark jars do not contain commons-jxpath jar, cause ClassNotFound exception
> ---------------------------------------------------------------------------
>
>                 Key: SPARK-28337
>                 URL: https://issues.apache.org/jira/browse/SPARK-28337
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Submit
>    Affects Versions: 3.0.0
>            Reporter: Wang Yanlin
>            Priority: Major
>         Attachments: exception_stack_trace.jpg, shde_in_pom.jpg
>
>
> When submitting a Spark application that uses XPathExpressionEngine, a ClassNotFoundException is thrown, even though I shade the classes of commons-jxpath:commons-jxpath into my jar.
> The main class of my jar is as follows:
> import org.apache.commons.configuration.tree.xpath.XPathExpressionEngine
>
> object StaticTestMain {
>   def main(args: Array[String]): Unit = {
>     println(s"yanzhi class loader info ${getClass.getClassLoader}")
>     val engine = new XPathExpressionEngine() // the exception happens on this line
>     // some other code
>   }
> }
> The configuration of the shade plugin is shown in the attached picture file "shade_in_pom.jpg".
> The exception stack trace is in the attached picture file "exception_stack_trace.jpg".


