Posted to user@spark.apache.org by "Ulanov, Alexander" <al...@hpe.com> on 2016/09/09 17:37:31 UTC

Accessing Spark packages through a proxy

Dear Spark users,

I am trying to use Spark packages, but I get the Ivy error listed below. I checked JIRA and Stack Overflow, and it appears to be a proxy issue. However, none of the proposed solutions worked for me. Could you suggest how to solve this?

https://issues.apache.org/jira/browse/SPARK-11085
http://stackoverflow.com/questions/36676395/how-to-resolve-external-packages-with-spark-shell-when-behind-a-corporate-proxy
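For reference, the kind of invocation those links suggest passes the JVM proxy properties to the driver via `spark.driver.extraJavaOptions` (the proxy host and port below are placeholders for our corporate proxy, not real values):

```shell
# Sketch of the proxy configuration from the linked answers;
# proxy.example.com:8080 is a placeholder for the actual corporate proxy.
./spark-shell \
  --conf "spark.driver.extraJavaOptions=-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=8080 -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=8080" \
  --packages com.databricks:spark-csv_2.11:1.2.0
```

This still produces the same failure for me.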

[hduser@server bin]$ ./spark-shell --packages com.databricks:spark-csv_2.11:1.2.0
Ivy Default Cache set to: /home/hduser/.ivy2/cache
The jars for the packages stored in: /home/hduser/.ivy2/jars
:: loading settings :: url = jar:file:/usr/local/spark-mlp/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
com.databricks#spark-csv_2.11 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
        confs: [default]
[Fatal Error] ivy-1.2.0.xml.original:2:10: Already seen doctype.
:: resolution report :: resolve 987ms :: artifacts dl 0ms
        :: modules in use:
        ---------------------------------------------------------------------
        |                  |            modules            ||   artifacts   |
        |       conf       | number| search|dwnlded|evicted|| number|dwnlded|
        ---------------------------------------------------------------------
        |      default     |   1   |   0   |   0   |   0   ||   0   |   0   |
        ---------------------------------------------------------------------

:: problems summary ::
:::: WARNINGS
                ::::::::::::::::::::::::::::::::::::::::::::::

                ::          UNRESOLVED DEPENDENCIES         ::

                ::::::::::::::::::::::::::::::::::::::::::::::

                :: com.databricks#spark-csv_2.11;1.2.0: java.text.ParseException: Already seen doctype.

                ::::::::::::::::::::::::::::::::::::::::::::::


:::: ERRORS
        Server access error at url https://repo1.maven.org/maven2/com/databricks/spark-csv_2.11/1.2.0/spark-csv_2.11-1.2.0.pom (javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException: No name matching repo1.maven.org found)

        Server access error at url https://repo1.maven.org/maven2/com/databricks/spark-csv_2.11/1.2.0/spark-csv_2.11-1.2.0.jar (javax.net.ssl.SSLHandshakeException: java.security.cert.CertificateException: No name matching repo1.maven.org found)

        Already seen doctype.


:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: com.databricks#spark-csv_2.11;1.2.0: java.text.ParseException: Already seen doctype.]
        at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1066)
        at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:294)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:158)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
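In case it helps with diagnosis: the `SSLHandshakeException: No name matching repo1.maven.org found` suggests the proxy may be presenting its own certificate. One way to check whether that reproduces outside Spark is to fetch the same POM directly through the proxy (again, the proxy host and port are placeholders):

```shell
# Fetch the same artifact URL through the proxy to inspect the
# certificate chain; proxy.example.com:8080 is a placeholder.
curl -v --proxy http://proxy.example.com:8080 \
  https://repo1.maven.org/maven2/com/databricks/spark-csv_2.11/1.2.0/spark-csv_2.11-1.2.0.pom
```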

Best regards, Alexander