Posted to issues@spark.apache.org by "YONG FENG (JIRA)" <ji...@apache.org> on 2015/12/05 17:15:10 UTC

[jira] [Commented] (SPARK-12033) Build Spark Error:UNRESOLVED DEPENDENCIES org.eclipse.paho#org.eclipse.paho.client.mqttv3;1.0.1: not found

    [ https://issues.apache.org/jira/browse/SPARK-12033?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15043370#comment-15043370 ] 

YONG FENG commented on SPARK-12033:
-----------------------------------

Just sharing the solution in case others hit the same issue. It seems some Maven links are broken. Manually downloading the missing pom files from http://sofia2.org/nexus/content/groups/public/org/eclipse/paho/ resolved it for me. BTW, I use the default Maven repository, http://repo1.maven.org.
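
For anyone who wants a concrete starting point, here is a rough sketch of that manual workaround (the wget URLs and the install command below are illustrative assumptions, not the exact steps I ran, and they assume the sbt build also consults the local Maven repository; if it does not, adding a working remote resolver achieves the same goal):

  # Fetch the missing pom and jar by hand (URL layout assumed to follow the
  # standard Maven directory structure on the mirror above)
  wget http://sofia2.org/nexus/content/groups/public/org/eclipse/paho/org.eclipse.paho.client.mqttv3/1.0.1/org.eclipse.paho.client.mqttv3-1.0.1.pom
  wget http://sofia2.org/nexus/content/groups/public/org/eclipse/paho/org.eclipse.paho.client.mqttv3/1.0.1/org.eclipse.paho.client.mqttv3-1.0.1.jar

  # Install both into the local Maven repository (~/.m2); the coordinates
  # (org.eclipse.paho:org.eclipse.paho.client.mqttv3:1.0.1) are read from the pom
  mvn install:install-file \
    -Dfile=org.eclipse.paho.client.mqttv3-1.0.1.jar \
    -DpomFile=org.eclipse.paho.client.mqttv3-1.0.1.pom

Either way, the goal is simply to give the build a location where the 1.0.1 pom and jar actually exist.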

> Build Spark Error:UNRESOLVED DEPENDENCIES org.eclipse.paho#org.eclipse.paho.client.mqttv3;1.0.1: not found
> ----------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-12033
>                 URL: https://issues.apache.org/jira/browse/SPARK-12033
>             Project: Spark
>          Issue Type: Question
>          Components: Build
>    Affects Versions: 1.5.2
>         Environment: Centos 7/scala 2.11.7/sbt 0.13.9/jdk 1.7.0_80
>            Reporter: wziyong
>            Priority: Minor
>              Labels: build, maven
>   Original Estimate: 12h
>  Remaining Estimate: 12h
>
> Recently I started to learn Spark. I cloned Spark and tried to build it, but I am stuck on a build error.
> The details of the error follow:
> > streaming-mqtt/update
> [warn] There may be incompatibilities among your library dependencies.
> [warn] Here are some of the libraries that were evicted:
> [warn] 	* com.google.code.findbugs:jsr305:1.3.9 -> 2.0.1
> [warn] 	* commons-net:commons-net:2.2 -> 3.1
> [warn] Run 'evicted' to see detailed eviction warnings
> [info] Updating {file:/home/wziyong/git/spark/}streaming-mqtt...
> [info] Resolving org.eclipse.paho#org.eclipse.paho.client.mqttv3;1.0.1 ...
> [warn] 	module not found: org.eclipse.paho#org.eclipse.paho.client.mqttv3;1.0.1
> [warn] ==== local: tried
> [warn]   /home/wziyong/.ivy2/local/org.eclipse.paho/org.eclipse.paho.client.mqttv3/1.0.1/ivys/ivy.xml
> [warn] ==== jcenter: tried
> [warn]   https://jcenter.bintray.com/org/eclipse/paho/org.eclipse.paho.client.mqttv3/1.0.1/org.eclipse.paho.client.mqttv3-1.0.1.pom
> [warn] ==== typesafe-ivy-releases: tried
> [warn]   https://repo.typesafe.com/typesafe/ivy-releases/org.eclipse.paho/org.eclipse.paho.client.mqttv3/1.0.1/ivys/ivy.xml
> [warn] ==== public: tried
> [warn]   https://repo1.maven.org/maven2/org/eclipse/paho/org.eclipse.paho.client.mqttv3/1.0.1/org.eclipse.paho.client.mqttv3-1.0.1.pom
> [info] Resolving org.fusesource.jansi#jansi;1.4 ...
> [warn] 	::::::::::::::::::::::::::::::::::::::::::::::
> [warn] 	::          UNRESOLVED DEPENDENCIES         ::
> [warn] 	::::::::::::::::::::::::::::::::::::::::::::::
> [warn] 	:: org.eclipse.paho#org.eclipse.paho.client.mqttv3;1.0.1: not found
> [warn] 	::::::::::::::::::::::::::::::::::::::::::::::
> [warn] 
> [warn] 	Note: Unresolved dependencies path:
> [warn] 		org.eclipse.paho:org.eclipse.paho.client.mqttv3:1.0.1 ((com.typesafe.sbt.pom.MavenHelper) MavenHelper.scala#L76)
> [warn] 		  +- org.apache.spark:spark-streaming-mqtt_2.10:1.6.0-SNAPSHOT
> [trace] Stack trace suppressed: run last streaming-mqtt/*:update for the full output.
> [error] (streaming-mqtt/*:update) sbt.ResolveException: unresolved dependency: org.eclipse.paho#org.eclipse.paho.client.mqttv3;1.0.1: not found
> [error] Total time: 87 s, completed Nov 28, 2015 5:40:26 PM
> It seems that the repositories do not hold org.eclipse.paho.client.mqttv3! Why? How can I fix it?
> I'd appreciate it if anyone can help me.



