Posted to reviews@spark.apache.org by "ueshin (via GitHub)" <gi...@apache.org> on 2024/02/22 22:20:38 UTC

[PR] [SPARK-47137][PYTHON][CONNECT] Add getAll to spark.conf for feature parity with Scala [spark]

ueshin opened a new pull request, #45222:
URL: https://github.com/apache/spark/pull/45222

   ### What changes were proposed in this pull request?
   
   Adds `getAll` to `spark.conf` for feature parity with Scala.
   
   ```py
   >>> spark.conf.getAll
   {'spark.sql.warehouse.dir': ...}
   ```
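   
   As a rough illustration only (not the actual PySpark/Connect implementation), `getAll` can be modeled as a read-only property that returns a plain `dict`, so it is accessed without parentheses, matching the attribute-style access of the Scala API. The `RuntimeConf` class and `_entries` field below are hypothetical stand-ins:
   
   ```py
   # Hedged sketch: RuntimeConf and _entries are hypothetical stand-ins,
   # not the real PySpark classes; shown only to illustrate the property shape.
   from typing import Dict
   
   class RuntimeConf:
       def __init__(self, entries: Dict[str, str]) -> None:
           self._entries = dict(entries)
   
       @property
       def getAll(self) -> Dict[str, str]:
           """Return a snapshot of all configuration entries as a dict."""
           return dict(self._entries)
   
   conf = RuntimeConf({"spark.sql.warehouse.dir": "/tmp/warehouse"})
   print(conf.getAll)  # {'spark.sql.warehouse.dir': '/tmp/warehouse'}
   ```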
   
   ### Why are the changes needed?
   
   The Scala API provides `spark.conf.getAll`, whereas Python does not.
   
   ```scala
   scala> spark.conf.getAll
   val res0: Map[String,String] = HashMap(spark.sql.warehouse.dir -> ...
   ```
   
   ### Does this PR introduce _any_ user-facing change?
   
   Yes, `spark.conf.getAll` will be available in PySpark.
   
   ### How was this patch tested?
   
   Added the related tests.
   
   ### Was this patch authored or co-authored using generative AI tooling?
   
   No.
   




Re: [PR] [SPARK-47137][PYTHON][CONNECT] Add getAll to spark.conf for feature parity with Scala [spark]

Posted by "dongjoon-hyun (via GitHub)" <gi...@apache.org>.
dongjoon-hyun closed pull request #45222: [SPARK-47137][PYTHON][CONNECT] Add getAll to spark.conf for feature parity with Scala
URL: https://github.com/apache/spark/pull/45222




Re: [PR] [SPARK-47137][PYTHON][CONNECT] Add getAll to spark.conf for feature parity with Scala [spark]

Posted by "dongjoon-hyun (via GitHub)" <gi...@apache.org>.
dongjoon-hyun commented on PR #45222:
URL: https://github.com/apache/spark/pull/45222#issuecomment-1960714342

   Thank you, @ueshin and @HyukjinKwon .




Re: [PR] [SPARK-47137][PYTHON][CONNECT] Add getAll to spark.conf for feature parity with Scala [spark]

Posted by "dongjoon-hyun (via GitHub)" <gi...@apache.org>.
dongjoon-hyun commented on PR #45222:
URL: https://github.com/apache/spark/pull/45222#issuecomment-1960714267

   Merged to master for Apache Spark 4.0.0.

