Posted to issues@spark.apache.org by "Xianjin YE (JIRA)" <ji...@apache.org> on 2019/06/28 10:28:00 UTC

[jira] [Created] (SPARK-28203) PythonRDD should respect SparkContext's conf when passing user confMap

Xianjin YE created SPARK-28203:
----------------------------------

             Summary: PythonRDD should respect SparkContext's conf when passing user confMap
                 Key: SPARK-28203
                 URL: https://issues.apache.org/jira/browse/SPARK-28203
             Project: Spark
          Issue Type: Bug
          Components: PySpark, Spark Core
    Affects Versions: 2.4.3
            Reporter: Xianjin YE


PythonRDD has several APIs that accept user configuration from the Python side. The parameter is called confAsMap, and it is intended to be merged into the RDD's Hadoop configuration.


However, confAsMap is first converted to a Configuration and only then merged into SparkContext's Hadoop configuration. Constructing that intermediate Configuration loads the default key-value pairs from core-default.xml etc. If any of those keys had been updated in SparkContext's Hadoop configuration, the freshly loaded default values override the updated values during the merge.
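The merge-order problem can be illustrated with plain Python dicts. This is a hedged sketch, not the actual Spark code: DEFAULTS stands in for the values loaded from core-default.xml, and new_configuration mimics how constructing a fresh Configuration reloads those defaults on top of the user-supplied keys. The key names are illustrative only.

```python
# Sketch of the confAsMap merge-order bug using dicts (hypothetical keys/values).
DEFAULTS = {"io.compression.codecs": "default-codec"}  # stands in for core-default.xml

def new_configuration(user_conf):
    """Mimics `new Configuration()` + setAll(confAsMap): starts from the XML
    defaults, then applies the user-supplied keys on top."""
    conf = dict(DEFAULTS)
    conf.update(user_conf)
    return conf

# SparkContext's hadoopConfiguration, where a default key was already updated.
sc_hadoop_conf = dict(DEFAULTS)
sc_hadoop_conf["io.compression.codecs"] = "updated-codec"

# The confAsMap passed from the Python side touches an unrelated key.
user_conf_as_map = {"mapreduce.input.fileinputformat.split.maxsize": "1048576"}

# Buggy order: confAsMap -> Configuration (reloads defaults) -> merged over sc's conf.
buggy = dict(sc_hadoop_conf)
buggy.update(new_configuration(user_conf_as_map))
# The reloaded default clobbers the updated value:
assert buggy["io.compression.codecs"] == "default-codec"

# Fixed order: start from sc's conf, apply only the keys the user actually set.
fixed = dict(sc_hadoop_conf)
fixed.update(user_conf_as_map)
assert fixed["io.compression.codecs"] == "updated-codec"
```

In other words, only the keys the user explicitly set in confAsMap should be merged in, rather than a full Configuration that silently reintroduces the XML defaults.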

I will submit a PR to fix this.



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org