Posted to issues@spark.apache.org by "Patrick Wendell (JIRA)" <ji...@apache.org> on 2014/11/10 08:02:33 UTC
[jira] [Resolved] (SPARK-3191) Add explanation of supporting building Spark with Maven in an HTTP proxy environment
[ https://issues.apache.org/jira/browse/SPARK-3191?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Patrick Wendell resolved SPARK-3191.
------------------------------------
Resolution: Won't Fix
Hi there - I thought a bit more about this and I think we probably shouldn't explicitly tell users to disable security settings. I've never heard of a user report this issue before - so it doesn't seem super common. However, let's do this. If we have other users report this issue again, we can add something to the docs and we'll use this patch as a starting point. So let's close this issue for now and re-open it later if necessary.
> Add explanation of supporting building spark with maven in http proxy environment
> ---------------------------------------------------------------------------------
>
> Key: SPARK-3191
> URL: https://issues.apache.org/jira/browse/SPARK-3191
> Project: Spark
> Issue Type: Documentation
> Components: Documentation
> Affects Versions: 1.0.2
> Environment: SUSE Linux 11
> Maven version: apache-maven-3.0.5
> Spark version: 1.0.1
> Maven proxy settings:
> <proxy>
> <id>lzb</id>
> <active>true</active>
> <protocol>http</protocol>
> <username>user</username>
> <password>password</password>
> <host>proxy.company.com</host>
> <port>8080</port>
> <nonProxyHosts>*.company.com</nonProxyHosts>
> </proxy>
> Reporter: zhengbing li
> Priority: Trivial
> Labels: build, maven
> Fix For: 1.2.0
>
> Original Estimate: 1h
> Remaining Estimate: 1h
>
> When I run "mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package" in an HTTP proxy environment, the build cannot finish. The error is as follows:
> [INFO] Spark Project YARN Stable API ..................... SUCCESS [34.217s]
> [INFO] Spark Project Assembly ............................ FAILURE [43.133s]
> [INFO] Spark Project External Twitter .................... SKIPPED
> [INFO] Spark Project External Kafka ...................... SKIPPED
> [INFO] Spark Project External Flume ...................... SKIPPED
> [INFO] Spark Project External ZeroMQ ..................... SKIPPED
> [INFO] Spark Project External MQTT ....................... SKIPPED
> [INFO] Spark Project Examples ............................ SKIPPED
> [INFO] ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] ------------------------------------------------------------------------
> [INFO] Total time: 27:57.309s
> [INFO] Finished at: Sat Aug 23 09:43:21 CST 2014
> [INFO] Final Memory: 51M/1080M
> [INFO] ------------------------------------------------------------------------
> [ERROR] Failed to execute goal org.apache.maven.plugins:maven-shade-plugin:2.2:shade (default) on project spark-assembly_2.10: Execution default of goal org.apache.maven.plugins:maven-shade-plugin:2.2:shade failed: Plugin org.apache.maven.plugins:maven-shade-plugin:2.2 or one of its dependencies could not be resolved: Could not find artifact com.google.code.findbugs:jsr305:jar:1.3.9 -> [Help 1]
> [ERROR]
> [ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> [ERROR]
> [ERROR] For more information about the errors and possible solutions, please read the following articles:
> [ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/PluginResolutionException
> [ERROR]
> [ERROR] After correcting the problems, you can resume the build with the command
> [ERROR] mvn <goals> -rf :spark-assembly_2.10
> The build succeeds if this command is used instead:
> mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -Dmaven.wagon.http.ssl.insecure=true -Dmaven.wagon.http.ssl.allowall=true -DskipTests clean package
> The error message is not very informative, and I spent a long time tracking this down.
> To help others who build Spark behind an HTTP proxy, I highly recommend adding this to the documentation.
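> For reference, rather than repeating the two -D flags on every invocation, the same Wagon properties from the workaround above can be exported once via MAVEN_OPTS (a sketch, not part of the original report; note these flags disable SSL certificate checking, which the resolution above deliberately avoids recommending):

```shell
# Make the SSL-relaxing workaround persistent for the current shell session.
# Maven reads JVM options, including -D system properties, from MAVEN_OPTS.
export MAVEN_OPTS="-Dmaven.wagon.http.ssl.insecure=true -Dmaven.wagon.http.ssl.allowall=true"

# Subsequent builds then pick up the properties without extra flags:
mvn -Pyarn -Phadoop-2.2 -Dhadoop.version=2.2.0 -DskipTests clean package
```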
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org