Posted to user@spark.apache.org by Akhil Das <ak...@sigmoidanalytics.com> on 2015/07/01 08:27:44 UTC

Re: Difference between spark-defaults.conf and SparkConf.set

.addJar works for me when I run it as a stand-alone application (without
using spark-submit).
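
For context, a minimal sketch of what that can look like in a stand-alone
driver program; the master URL and JAR path below are placeholders, not
values from this thread:

    import org.apache.spark.{SparkConf, SparkContext}

    object AddJarExample {
      def main(args: Array[String]): Unit = {
        // Build the configuration in code rather than via spark-defaults.conf.
        val conf = new SparkConf()
          .setAppName("AddJarExample")
          .setMaster("local[*]")              // placeholder master URL
        val sc = new SparkContext(conf)

        // Ship the external JAR to the executors for this application.
        // Note: addJar affects tasks on executors, not the driver's own classpath.
        sc.addJar("/path/to/external.jar")    // placeholder path

        // ... job logic ...

        sc.stop()
      }
    }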

Thanks
Best Regards

On Tue, Jun 30, 2015 at 7:47 PM, Yana Kadiyska <ya...@gmail.com>
wrote:

> Hi folks, running into a pretty strange issue:
>
> I'm setting
> spark.executor.extraClassPath
> spark.driver.extraClassPath
>
> to point to some external JARs. If I set them in spark-defaults.conf
> everything works perfectly.
> However, if I remove spark-defaults.conf and just create a SparkConf and
> call
> .set("spark.executor.extraClassPath","...)
> .set("spark.driver.extraClassPath",...)
>
> I get ClassNotFound exceptions from Hadoop Conf:
>
> Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.ceph.CephFileSystem not found
>         at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1493)
>         at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1585)
>
>
> This seems like a bug to me -- or does spark-defaults.conf somehow get
> processed differently?
>
> I have dumped out sparkConf.toDebugString and in both cases
> (spark-defaults.conf vs. setting them in code) it seems to contain the same values...
>
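
For comparison, a rough sketch of the two ways of supplying the class path
discussed above; the JAR path is a placeholder. In spark-defaults.conf:

    spark.executor.extraClassPath   /opt/libs/ceph-hadoop.jar
    spark.driver.extraClassPath     /opt/libs/ceph-hadoop.jar

and the programmatic form, set before the SparkContext is created:

    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("ExtraClassPathExample")
      .set("spark.executor.extraClassPath", "/opt/libs/ceph-hadoop.jar")
      .set("spark.driver.extraClassPath", "/opt/libs/ceph-hadoop.jar")
    val sc = new SparkContext(conf)

One caveat worth noting: the Spark configuration documentation states that
spark.driver.extraClassPath should not be set through SparkConf directly in
client mode, because the driver JVM has already started by the time the
application code runs; that may explain why only the spark-defaults.conf
route works here.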