Posted to issues@spark.apache.org by "Nina Pakhomova (JIRA)" <ji...@apache.org> on 2015/10/27 23:15:28 UTC

[jira] [Closed] (SPARK-11357) Building Spark with Maven doesn't add all necessary classes to spark-network-yarn_2.10-XXX.jar.

     [ https://issues.apache.org/jira/browse/SPARK-11357?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Nina Pakhomova closed SPARK-11357.
----------------------------------
    Resolution: Invalid

> Building Spark with Maven doesn't add all necessary classes to spark-network-yarn_2.10-XXX.jar.
> ------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-11357
>                 URL: https://issues.apache.org/jira/browse/SPARK-11357
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 1.5.1
>            Reporter: Nina Pakhomova
>            Priority: Minor
>
> After executing 
> {code}
> mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.4.2  -Phive  -Phive-thriftserver -DskipTests clean package
> {code}
> the file spark-network-yarn_2.10-XXX.jar is produced in $SPARK_HOME/network/yarn/target/scala-<version>, as [http://spark.apache.org/docs/latest/job-scheduling.html#dynamic-resource-allocation] says.
> But the only files in this jar are:
> {quote}
> META-INF	org
> ./META-INF:
> DEPENDENCIES	LICENSE		MANIFEST.MF	NOTICE		maven
> ./META-INF/maven:
> org.apache.spark
> ./META-INF/maven/org.apache.spark:
> spark-network-yarn_2.10
> ./META-INF/maven/org.apache.spark/spark-network-yarn_2.10:
> pom.properties	pom.xml
> ./org:
> apache
> ./org/apache:
> spark
> ./org/apache/spark:
> network
> ./org/apache/spark/network:
> yarn
> ./org/apache/spark/network/yarn:
> YarnShuffleService.class	util
> ./org/apache/spark/network/yarn/util:
> HadoopConfigProvider.class
> {quote}
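> The listing above (which looks like an ls -R of the unpacked jar) can also be reproduced directly with the JDK's jar tool; a minimal sketch, keeping the XXX and <version> placeholders as-is:
> {code}
> # List the entries of the network-yarn jar produced by the maven build
> jar tf $SPARK_HOME/network/yarn/target/scala-<version>/spark-network-yarn_2.10-XXX.jar
> {code}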
> It lacks some classes, and the DataNode fails with ClassNotFoundException. When I download a pre-built distribution and use the jar from there, everything works. I understand that I probably just need to run make-distribution, but I would like the default approach [http://spark.apache.org/docs/latest/building-spark.html#building-with-buildmvn] to work too.
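> For reference, a minimal sketch of the make-distribution route with the same profiles as the maven command above (assuming Spark 1.5.x, where the script sits at the repository root; the --name value is just illustrative):
> {code}
> # Build a runnable distribution; the output lands in dist/ (plus a .tgz when --tgz is given)
> ./make-distribution.sh --name custom-2.6.0-cdh5.4.2 --tgz \
>   -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0-cdh5.4.2 -Phive -Phive-thriftserver
> {code}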


