Posted to commits@spark.apache.org by gu...@apache.org on 2021/02/16 01:26:10 UTC
[spark] branch branch-3.1 updated: [SPARK-33434][PYTHON][DOCS] Added RuntimeConfig to PySpark docs
This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch branch-3.1
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.1 by this push:
new a75dc10 [SPARK-33434][PYTHON][DOCS] Added RuntimeConfig to PySpark docs
a75dc10 is described below
commit a75dc104623127f362decacd8e1c53e7dc998612
Author: Eric Lemmon <er...@lemmon.cc>
AuthorDate: Sat Feb 13 09:32:55 2021 -0600
[SPARK-33434][PYTHON][DOCS] Added RuntimeConfig to PySpark docs
Documentation for `SparkSession.conf.isModifiable` is missing from the Python API site, so we added a Configuration section to the Spark SQL page to expose docs for the `RuntimeConfig` class (the class containing `isModifiable`). Then a `:class:` reference to `RuntimeConfig` was added to the `SparkSession.conf` docstring to create a link there as well.
Previously, no docs were generated for `pyspark.sql.conf.RuntimeConfig`, so `isModifiable` had no page to land on.
Yes: this adds a new Configuration section to the Spark SQL page and a `Returns` section to the `SparkSession.conf` docstring, which now links to the `pyspark.sql.conf.RuntimeConfig` page. This is a change compared to both the released Spark version and the unreleased master branch.
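For context, `RuntimeConfig.isModifiable` reports whether a given SQL config can be changed at runtime on an existing session. The toy stand-in below (illustrative only, not PySpark's actual implementation; the config keys and the `_STATIC` set are made up for the sketch) shows the semantics the newly documented class exposes:

```python
# Toy stand-in for pyspark.sql.conf.RuntimeConfig -- a sketch of the
# semantics only, NOT PySpark's real implementation.
class RuntimeConfig:
    # Hypothetical: treat one key as a static (session-start-only) config.
    _STATIC = {"spark.sql.warehouse.dir"}

    def __init__(self):
        self._conf = {
            "spark.sql.shuffle.partitions": "200",
            "spark.sql.warehouse.dir": "/tmp/warehouse",
        }

    def get(self, key):
        return self._conf[key]

    def set(self, key, value):
        # Static configs are fixed once the session starts.
        if not self.isModifiable(key):
            raise ValueError(f"Cannot modify static config: {key}")
        self._conf[key] = value

    def isModifiable(self, key):
        # Runtime configs may be reset per session; static ones may not.
        return key in self._conf and key not in self._STATIC


conf = RuntimeConfig()
print(conf.isModifiable("spark.sql.shuffle.partitions"))  # True
print(conf.isModifiable("spark.sql.warehouse.dir"))       # False
```

In real PySpark the object is obtained via `spark.conf` on a live `SparkSession`; the point of the patch is that this class and its methods are now discoverable from the API docs.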
First, the Python docs were built:
```bash
cd $SPARK_HOME/docs
SKIP_SCALADOC=1 SKIP_RDOC=1 SKIP_SQLDOC=1 jekyll serve
```
Then verified all pages and links:
1. Configuration link displayed on the API Reference page, and it clicks through to Spark SQL page:
http://localhost:4000/api/python/reference/index.html
![image](https://user-images.githubusercontent.com/1160861/107601918-a2f02380-6bed-11eb-9b8f-974a0681a2a9.png)
2. Configuration section displayed on the Spark SQL page, and the RuntimeConfig link clicks through to the RuntimeConfig page:
http://localhost:4000/api/python/reference/pyspark.sql.html#configuration
![image](https://user-images.githubusercontent.com/1160861/107602058-0d08c880-6bee-11eb-8cbb-ad8c47588085.png)
3. RuntimeConfig page displayed:
http://localhost:4000/api/python/reference/api/pyspark.sql.conf.RuntimeConfig.html
![image](https://user-images.githubusercontent.com/1160861/107602278-94eed280-6bee-11eb-95fc-445ea62ac1a4.png)
4. SparkSession.conf page displays the RuntimeConfig link, and it navigates to the RuntimeConfig page:
http://localhost:4000/api/python/reference/api/pyspark.sql.SparkSession.conf.html
![image](https://user-images.githubusercontent.com/1160861/107602435-1f373680-6bef-11eb-985a-b72432464940.png)
Closes #31483 from Eric-Lemmon/SPARK-33434-document-isModifiable.
Authored-by: Eric Lemmon <er...@lemmon.cc>
Signed-off-by: Sean Owen <sr...@gmail.com>
(cherry picked from commit e3b6e4ad435b31aabd6781df63c50c1f92bdfeac)
Signed-off-by: HyukjinKwon <gu...@apache.org>
---
python/docs/source/reference/pyspark.sql.rst | 11 +++++++++++
python/pyspark/sql/session.py | 4 ++++
2 files changed, 15 insertions(+)
diff --git a/python/docs/source/reference/pyspark.sql.rst b/python/docs/source/reference/pyspark.sql.rst
index 0dc2f6e..bb2af34 100644
--- a/python/docs/source/reference/pyspark.sql.rst
+++ b/python/docs/source/reference/pyspark.sql.rst
@@ -73,6 +73,17 @@ See also :class:`SparkSession`.
SparkSession.version
+Configuration
+-------------
+
+.. currentmodule:: pyspark.sql.conf
+
+.. autosummary::
+ :toctree: api/
+
+ RuntimeConfig
+
+
Input and Output
----------------
diff --git a/python/pyspark/sql/session.py b/python/pyspark/sql/session.py
index 5d0aad3..4db958a 100644
--- a/python/pyspark/sql/session.py
+++ b/python/pyspark/sql/session.py
@@ -338,6 +338,10 @@ class SparkSession(SparkConversionMixin):
This is the interface through which the user can get and set all Spark and Hadoop
configurations that are relevant to Spark SQL. When getting the value of a config,
this defaults to the value set in the underlying :class:`SparkContext`, if any.
+
+ Returns
+ -------
+ :class:`pyspark.sql.conf.RuntimeConfig`
"""
if not hasattr(self, "_conf"):
self._conf = RuntimeConfig(self._jsparkSession.conf())
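For reference, an `autosummary` directive with `:toctree: api/` (as added above) makes Sphinx generate one stub page per listed entry. With Sphinx's default class template, the generated `pyspark.sql.conf.RuntimeConfig` stub would look roughly like the fragment below; this is a sketch of autosummary's default output, not a file contained in this patch, and PySpark's build may use customized templates:

```rst
pyspark.sql.conf.RuntimeConfig
==============================

.. currentmodule:: pyspark.sql.conf

.. autoclass:: RuntimeConfig
```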