Posted to issues@spark.apache.org by "Sidney Feiner (JIRA)" <ji...@apache.org> on 2017/01/27 12:35:24 UTC

[jira] [Commented] (SPARK-19369) SparkConf not getting properly initialized in PySpark 2.1.0

    [ https://issues.apache.org/jira/browse/SPARK-19369?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15842726#comment-15842726 ] 

Sidney Feiner commented on SPARK-19369:
---------------------------------------

Ok, thanks :) When is 2.1.1 estimated to be released?

> SparkConf not getting properly initialized in PySpark 2.1.0
> -----------------------------------------------------------
>
>                 Key: SPARK-19369
>                 URL: https://issues.apache.org/jira/browse/SPARK-19369
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.1.0
>         Environment: Windows/Linux
>            Reporter: Sidney Feiner
>              Labels: configurations, context, pyspark
>
> Trying to migrate from Spark 1.6 to 2.1, I've stumbled upon a small problem - my SparkContext doesn't get its configurations from the SparkConf object. Before passing them on to the SparkContext constructor, I've made sure my configurations are set.
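> To illustrate, here's a minimal sketch of the setup (the app name and config key are placeholders, not my actual job):
> from pyspark import SparkConf, SparkContext
> conf = SparkConf().setAppName("my-app")      # placeholder app name
> conf.set("spark.executor.memory", "2g")      # placeholder setting
> print(conf.get("spark.executor.memory"))     # prints "2g" - the conf itself holds the value
> sc = SparkContext(conf=conf)                 # on 2.1.0, the setting doesn't reach the context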
> I've done some digging and this is what I've found:
> When I initialize the SparkContext, the following code is executed (this is _do_init() in pyspark/context.py):
> def _do_init(self, master, appName, sparkHome, pyFiles, environment, batchSize, serializer,
>              conf, jsc, profiler_cls):
>     self.environment = environment or {}
>     if conf is not None and conf._jconf is not None:
>         self._conf = conf
>     else:
>         self._conf = SparkConf(_jvm=SparkContext._jvm)
> So I can see that my SparkConf will only be used if its _jconf is set, and _jconf is only set when the conf was created while a JVM (_jvm) was available.
> I've submitted my job with spark-submit and printed SparkContext._jvm, but it was null, which explains why my SparkConf object gets ignored.
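> A quick check makes this visible (a hypothetical diagnostic, run before any context exists):
> from pyspark import SparkConf
> from pyspark.context import SparkContext
> conf = SparkConf().set("spark.executor.memory", "2g")  # placeholder setting
> print(SparkContext._jvm)   # None - the gateway/JVM hasn't been launched yet
> print(conf._jconf)         # also None, so _do_init() falls into the else branch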
> I've tried running exactly the same code on Spark 2.0.1 and it worked! There, my SparkConf object had a valid _jconf.
> Am I doing something wrong, or is this a bug?
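> In the meantime, a possible workaround (just a sketch relying on the internal _ensure_initialized(), not something I've fully verified): launch the gateway before building the conf, so SparkConf picks up a live _jvm:
> from pyspark import SparkConf
> from pyspark.context import SparkContext
> SparkContext._ensure_initialized()        # internal API: starts the gateway/JVM
> conf = SparkConf()                        # SparkContext._jvm is now set, so _jconf gets created
> conf.set("spark.executor.memory", "2g")   # placeholder setting
> sc = SparkContext(conf=conf)              # conf._jconf is not None, so the conf is honored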



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org