Posted to issues@spark.apache.org by "Shardul Mahadik (Jira)" <ji...@apache.org> on 2021/04/14 15:15:00 UTC
[jira] [Updated] (SPARK-35072) spark.jars.ivysettings should support local:// and hdfs:// schemes
[ https://issues.apache.org/jira/browse/SPARK-35072?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Shardul Mahadik updated SPARK-35072:
------------------------------------
Description:
During reviews of SPARK-34472, there was a desire to support local:// and hdfs:// schemes and potentially other schemes with spark.jars.ivysettings. See https://github.com/apache/spark/pull/31591#discussion_r598850998 and https://github.com/apache/spark/pull/31591#issuecomment-817951152. Currently this fails with the following error:
{code:java}
./bin/spark-shell --conf spark.jars.packages="org.apache.commons:commons-lang:3.4" --conf spark.jars.ivySettings="local:///Users/test/temp/ivysettings.xml"
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Ivy settings file local:/Users/test/temp/ivysettings.xml does not exist
at scala.Predef$.require(Predef.scala:281)
at org.apache.spark.deploy.SparkSubmitUtils$.loadIvySettings(SparkSubmit.scala:1288)
at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:176)
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:308)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:895)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1031)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1040)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
{code}
We should make sure that this also works with {{SparkContext#addJar}} when running in cluster mode. At this point, only the YARN resource manager supports managing the {{ivysettings}} file in cluster mode. This can change based on SPARK-35073.
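The failure happens because the settings path is treated as a plain local file, so a URI like {{local:///Users/test/temp/ivysettings.xml}} is passed to Ivy verbatim and the existence check fails. A minimal sketch of scheme-aware resolution is below; the class and method names here are hypothetical illustrations, not Spark's actual API. Schemes that already map to the driver's filesystem ({{file}}, {{local}}, or no scheme) can be unwrapped to a plain path, while remote schemes such as {{hdfs}} would first need the file fetched to a local temp location (e.g. via Hadoop's {{FileSystem}} API) before Ivy can load it:

```java
import java.net.URI;

/**
 * Sketch of scheme-aware resolution for an ivySettings location.
 * IvySettingsResolver and resolveIvySettingsPath are hypothetical names
 * used only for illustration; they are not part of Spark.
 */
public class IvySettingsResolver {

    // Returns a local filesystem path for schemes that map directly onto
    // the driver's filesystem. "local://" means the file is already present
    // on the node, so only the path component is needed.
    public static String resolveIvySettingsPath(String settingsUri) {
        URI uri = URI.create(settingsUri);
        String scheme = uri.getScheme();
        if (scheme == null || scheme.equals("file") || scheme.equals("local")) {
            // Strip the scheme and return the bare path for Ivy to load.
            return uri.getPath();
        }
        // A real implementation would copy remote files (hdfs://, s3a://, ...)
        // to a local temp file and return that temp path instead.
        throw new UnsupportedOperationException(
            "remote scheme not handled in this sketch: " + scheme);
    }

    public static void main(String[] args) {
        System.out.println(
            resolveIvySettingsPath("local:///Users/test/temp/ivysettings.xml"));
    }
}
```

With this kind of resolution in place, the {{local:///Users/test/temp/ivysettings.xml}} example from the command above would be reduced to {{/Users/test/temp/ivysettings.xml}} before the existence check, instead of failing outright.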
was:
During reviews of SPARK-34472, there was a desire to support local:// and hdfs:// schemes and potentially other schemes with spark.jars.ivysettings. See https://github.com/apache/spark/pull/31591#discussion_r598850998 and https://github.com/apache/spark/pull/31591#issuecomment-817951152. Currently this fails with the following error:
{code:java}
./bin/spark-shell --conf spark.jars.packages="org.apache.commons:commons-lang:3.4" --conf spark.jars.ivySettings="local:///Users/test/temp/ivysettings.xml"
Exception in thread "main" java.lang.IllegalArgumentException: requirement failed: Ivy settings file local:/Users/test/temp/ivysettings.xml does not exist
at scala.Predef$.require(Predef.scala:281)
at org.apache.spark.deploy.SparkSubmitUtils$.loadIvySettings(SparkSubmit.scala:1288)
at org.apache.spark.util.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:176)
at org.apache.spark.deploy.SparkSubmit.prepareSubmitEnvironment(SparkSubmit.scala:308)
at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:895)
at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1031)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1040)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
{code}
We should make sure that this also works with {{SparkContext#addJar}} when running in cluster mode. At this point, only the YARN resource manager supports managing the {{ivysettings}} file in cluster mode.
> spark.jars.ivysettings should support local:// and hdfs:// schemes
> ------------------------------------------------------------------
>
> Key: SPARK-35072
> URL: https://issues.apache.org/jira/browse/SPARK-35072
> Project: Spark
> Issue Type: Improvement
> Components: Spark Core
> Affects Versions: 3.2.0
> Reporter: Shardul Mahadik
> Priority: Minor
>
--
This message was sent by Atlassian Jira
(v8.3.4#803005)