Posted to issues@spark.apache.org by "yuriy_hupalo (JIRA)" <ji...@apache.org> on 2017/01/20 08:01:26 UTC

[jira] [Updated] (SPARK-19307) SPARK-17387 caused ignorance of conf object passed to SparkContext:

     [ https://issues.apache.org/jira/browse/SPARK-19307?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

yuriy_hupalo updated SPARK-19307:
---------------------------------
    Description: 
After the SPARK-17387 patch was applied, the SparkConf object is ignored when launching a SparkContext programmatically via Python from spark-submit:

https://github.com/apache/spark/blob/master/python/pyspark/context.py#L128:

When running SparkContext(conf=xxx) in Python from spark-submit:
    conf is set, but conf._jconf is None,

    so the conf object passed as an argument is ignored (it is used only when launching the java_gateway).
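
A minimal reproduction sketch (the property name spark.custom.key is only an illustration, not a real Spark setting):

{code:title=repro.py}
from pyspark import SparkConf, SparkContext

conf = SparkConf().setAppName("repro").set("spark.custom.key", "custom-value")
sc = SparkContext(conf=conf)

# When this script is launched via spark-submit, conf._jconf is None inside
# SparkContext.__init__, so the programmatic setting above is silently dropped:
print(sc.getConf().get("spark.custom.key", "NOT SET"))  # prints "NOT SET"
sc.stop()
{code}

Run it with: spark-submit repro.py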

How to fix (python/pyspark/context.py:132):

{code:title=python/pyspark/context.py:132}
        if conf is not None and conf._jconf is not None:
            # conf has been initialized in JVM properly, so use conf directly. This represents the
            # scenario that JVM has been launched before SparkConf is created (e.g. SparkContext is
            # created and then stopped, and we create a new SparkConf and new SparkContext again)
            self._conf = conf
        else:
            self._conf = SparkConf(_jvm=SparkContext._jvm)
+            if conf is not None:
+                # Copy entries from the driver-side conf into the JVM-backed conf,
+                # so settings passed programmatically are not silently dropped.
+                for key, value in conf.getAll():
+                    self._conf.set(key, value)
{code}
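
With a change along these lines, SparkConf(_jvm=SparkContext._jvm) still picks up the properties that spark-submit exported as JVM system properties, and the getAll() copy then layers the programmatic settings on top, so values set directly on the SparkConf keep their documented precedence over spark-submit and spark-defaults.conf values.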




> SPARK-17387 caused ignorance of conf object passed to SparkContext:
> -------------------------------------------------------------------
>
>                 Key: SPARK-19307
>                 URL: https://issues.apache.org/jira/browse/SPARK-19307
>             Project: Spark
>          Issue Type: Bug
>          Components: PySpark
>    Affects Versions: 2.1.0
>            Reporter: yuriy_hupalo
>


