Posted to user@spark.apache.org by Nathan McLean <nm...@kinsolresearch.com> on 2017/10/13 20:32:50 UTC

Bug Report: Spark Config Defaults Not Loading with Python Code / spark-submit

Here is an example PySpark program that illustrates the problem. When it is
run with spark-submit, the Spark default configuration does not appear to be
loaded when a new SparkConf is instantiated (contrary to what the
loadDefaults=True keyword argument implies). When the same code is run in the
interactive pyspark shell, the defaults are loaded as expected.

I also tested this behaviour with a simple Scala program; the Scala code
behaved correctly and picked up the defaults.
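
For reference, the Scala check was roughly the following (a minimal sketch;
the object name is illustrative and this is not the exact program I ran):

import org.apache.spark.SparkConf

object ConfDefaultsCheck {
  def main(args: Array[String]): Unit = {
    // Packaged and run with spark-submit, this prints the expected spark.*
    // options: on the JVM side spark-submit passes them in as system
    // properties and SparkConf (with loadDefaults = true) picks them up.
    val conf = new SparkConf()
    println(conf.toDebugString)
  }
}

The PySpark program that reproduces the problem: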


#!/usr/bin/python2.7
from pyspark.conf import SparkConf
from pyspark import SparkContext


# Under spark-submit this prints nothing: the bare SparkConf contains no
# entries, even though loadDefaults defaults to True.
print 'SPARK DEFAULTS'
conf = SparkConf()
print conf.toDebugString()

# This prints the expected configuration options: the conf obtained from an
# instantiated SparkContext does include the defaults.
print 'SPARK DEFAULTS'
spark_context = SparkContext()
conf = spark_context.getConf()
print conf.toDebugString()


This bug does not seem to exist in Spark 1.6.x; I have reproduced it in
Spark 2.1.1 and Spark 2.2.0.