Posted to user@spark.apache.org by Bruno Aranda <ba...@apache.org> on 2018/05/23 18:16:44 UTC

Cannot make Spark honour the spark.jars.ivySettings config

Hi,

I am trying to use my own Ivy settings file. To test this, I am submitting
to Spark with a command like the following:

spark-shell --packages some-org:some-artifact:102 \
  --conf spark.jars.ivySettings=/home/hadoop/ivysettings.xml

The idea is to be able to get the artifact from a private repository.
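For context, the kind of settings file I mean looks roughly like this (a
minimal sketch; the repository URL and resolver names are placeholders, not
my real values): a chain resolver that tries the private repository first
and falls back to Maven Central.

<ivysettings>
  <settings defaultResolver="main"/>
  <resolvers>
    <chain name="main">
      <!-- private repository (placeholder URL) -->
      <ibiblio name="private-repo" m2compatible="true"
               root="https://repo.example.com/maven"/>
      <!-- fall back to Maven Central -->
      <ibiblio name="central" m2compatible="true"/>
    </chain>
  </resolvers>
</ivysettings>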

The Ivy settings file at /home/hadoop/ivysettings.xml does exist and is
valid. I have used it to resolve the artifact successfully with standalone
Ivy, for example:

java -jar ivy-2.4.0.jar -settings /home/hadoop/ivysettings.xml \
  -dependency some-org some-artifact 102

If I run the spark-shell command in verbose mode, I can see:

Using properties file: /usr/lib/spark/conf/spark-defaults.conf
[...]
Spark properties used, including those specified through
 --conf and those from the properties file
/usr/lib/spark/conf/spark-defaults.conf:
[...]
*(spark.jars.ivySettings,/home/hadoop/ivysettings.xml)*

Ivy Default Cache set to: /home/hadoop/.ivy2/cache
The jars for the packages stored in: /home/hadoop/.ivy2/jars
*:: loading settings :: url =
jar:file:/usr/lib/spark/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml*
org.apache.spark#spark-streaming-kafka-0-10_2.11 added as a dependency
some-org#some-artifact added as a dependency
[...]

So why is Spark still loading the default Ivy settings when I have provided
my own, and the config property appears to be picked up correctly?

Thanks!!

Bruno