Posted to issues@spark.apache.org by "Irina Truong (JIRA)" <ji...@apache.org> on 2017/06/07 15:24:18 UTC

[jira] [Commented] (SPARK-19307) SPARK-17387 caused ignorance of conf object passed to SparkContext:

    [ https://issues.apache.org/jira/browse/SPARK-19307?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16041040#comment-16041040 ] 

Irina Truong commented on SPARK-19307:
--------------------------------------

Is this available in 2.1.1? I could not find it in the release notes.

> SPARK-17387 caused ignorance of conf object passed to SparkContext:
> -------------------------------------------------------------------
>
>                 Key: SPARK-19307
>                 URL: https://issues.apache.org/jira/browse/SPARK-19307
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.1.0
>            Reporter: yuriy_hupalo
>            Assignee: Marcelo Vanzin
>         Attachments: SPARK-19307.patch
>
>
> After the SPARK-17387 patch was applied, the SparkConf object is ignored when launching a SparkContext programmatically via Python from spark-submit:
> https://github.com/apache/spark/blob/master/python/pyspark/context.py#L128
> When running a Python SparkContext(conf=xxx) from spark-submit:
>     conf is set, but conf._jconf is None
>     the conf object passed as an argument is ignored (and used only when launching the java_gateway).
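> A minimal reproduction sketch (the file name repro.py and the config key below are illustrative). Submitted via spark-submit, the setting from the conf object is expected to be dropped:
> {code:title=repro.py}
> from pyspark import SparkConf, SparkContext
>
> # Build a conf with a non-default setting and pass it to SparkContext.
> conf = SparkConf().setAppName("repro").set("spark.ui.enabled", "false")
> sc = SparkContext(conf=conf)
>
> # With the bug present, the setting is silently dropped because conf._jconf
> # is None when the JVM gateway was already started by spark-submit.
> print(sc.getConf().get("spark.ui.enabled", "<not set>"))  # expected "false"
> sc.stop()
> {code}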
> How to fix (python/pyspark/context.py:132):
> {code:title=python/pyspark/context.py:132}
>         if conf is not None and conf._jconf is not None:
>             # conf has been initialized in the JVM properly, so use conf directly. This represents the
>             # scenario where the JVM has been launched before SparkConf is created (e.g. SparkContext is
>             # created and then stopped, and we create a new SparkConf and a new SparkContext again)
>             self._conf = conf
>         else:
>             self._conf = SparkConf(_jvm=SparkContext._jvm)
> +            # Copy the settings from the conf object passed in, so they are
> +            # not silently dropped when conf._jconf is None.
> +            if conf:
> +                for key, value in conf.getAll():
> +                    self._conf.set(key, value)
> {code}
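> A quick sanity check of the copy loop above, runnable in a plain Python shell without spark-submit (the keys below are illustrative):
> {code:title=copy loop sanity check}
> from pyspark import SparkConf
>
> # With no JVM available, SparkConf keeps its settings in a local dict,
> # so this runs in a plain Python interpreter.
> src = SparkConf(loadDefaults=False).set("spark.executor.memory", "2g")
> dst = SparkConf(loadDefaults=False)
> for key, value in src.getAll():
>     dst.set(key, value)
> assert dst.get("spark.executor.memory") == "2g"
> {code}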



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org