Posted to commits@spark.apache.org by gu...@apache.org on 2022/06/09 05:27:26 UTC
[spark] branch branch-3.3 updated: [SPARK-39421][PYTHON][DOCS] Pin the docutils version <0.18 in documentation build
This is an automated email from the ASF dual-hosted git repository.
gurwls223 pushed a commit to branch branch-3.3
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/branch-3.3 by this push:
new 66826567fa1 [SPARK-39421][PYTHON][DOCS] Pin the docutils version <0.18 in documentation build
66826567fa1 is described below
commit 66826567fa12e57119acc97f9971e36fe834df21
Author: Hyukjin Kwon <gu...@gmail.com>
AuthorDate: Thu Jun 9 14:26:45 2022 +0900
[SPARK-39421][PYTHON][DOCS] Pin the docutils version <0.18 in documentation build
### What changes were proposed in this pull request?
This PR fixes the Sphinx build failure below (see https://github.com/singhpk234/spark/runs/6799026458?check_suite_focus=true):
```
Moving to python/docs directory and building sphinx.
Running Sphinx v3.0.4
WARNING:root:'PYARROW_IGNORE_TIMEZONE' environment variable was not set. It is required to set this environment variable to '1' in both driver and executor sides if you use pyarrow>=2.0.0. pandas-on-Spark will set it for you but it does not work if there is a Spark context already launched.
/__w/spark/spark/python/pyspark/pandas/supported_api_gen.py:101: UserWarning: Warning: Latest version of pandas(>=1.4.0) is required to generate the documentation; however, your version was 1.3.5
warnings.warn(
Warning, treated as error:
node class 'meta' is already registered, its visitors will be overridden
make: *** [Makefile:35: html] Error 2
------------------------------------------------
Jekyll 4.2.1 Please append `--trace` to the `build` command
for any additional information or backtrace.
------------------------------------------------
```
The Sphinx build apparently fails with the latest docutils (see also https://issues.apache.org/jira/browse/FLINK-24662). We should pin the docutils version below 0.18.
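As a rough illustration of what the `docutils<0.18.0` constraint means, here is a minimal version-comparison sketch. This is not pip's actual resolver (pip follows PEP 440); the padding of version strings to three numeric parts is a simplification assumed for this example:

```python
# Sketch of the "docutils<0.18.0" pin added in this PR.
# Simplified illustration only -- pip's real resolver implements PEP 440.
def satisfies_pin(version: str, ceiling: str = "0.18.0") -> bool:
    """Return True if `version` is strictly below `ceiling`."""
    def to_tuple(v: str):
        # Pad to three parts so "0.18" compares equal to "0.18.0".
        parts = [int(p) for p in v.split(".")]
        parts += [0] * (3 - len(parts))
        return tuple(parts)
    return to_tuple(version) < to_tuple(ceiling)

print(satisfies_pin("0.17.1"))  # True: allowed by the pin
print(satisfies_pin("0.18"))    # False: excluded, the release that breaks the build
```

Under this pin, pip would keep docutils at 0.17.x, which avoids the "node class 'meta' is already registered" error with Sphinx 3.x.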
### Why are the changes needed?
To recover the CI.
### Does this PR introduce _any_ user-facing change?
No, dev-only.
### How was this patch tested?
CI in this PR should test it out.
Closes #36813 from HyukjinKwon/SPARK-39421.
Lead-authored-by: Hyukjin Kwon <gu...@gmail.com>
Co-authored-by: Hyukjin Kwon <gu...@apache.org>
Signed-off-by: Hyukjin Kwon <gu...@apache.org>
(cherry picked from commit c196ff4dfa1d9f1a8e20b884ee5b4a4e6e65a6e3)
Signed-off-by: Hyukjin Kwon <gu...@apache.org>
---
.github/workflows/build_and_test.yml | 1 +
dev/requirements.txt | 1 +
2 files changed, 2 insertions(+)
diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index 1f5df70cde9..e0e9f70556c 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -528,6 +528,7 @@ jobs:
python3.9 -m pip install 'sphinx<3.1.0' mkdocs pydata_sphinx_theme ipython nbsphinx numpydoc 'jinja2<3.0.0' 'markupsafe==2.0.1'
python3.9 -m pip install ipython_genutils # See SPARK-38517
python3.9 -m pip install sphinx_plotly_directive 'numpy>=1.20.0' pyarrow pandas 'plotly>=4.8'
+ python3.9 -m pip install 'docutils<0.18.0' # See SPARK-39421
apt-get update -y
apt-get install -y ruby ruby-dev
Rscript -e "install.packages(c('devtools', 'testthat', 'knitr', 'rmarkdown', 'markdown', 'e1071', 'roxygen2'), repos='https://cloud.r-project.org/')"
diff --git a/dev/requirements.txt b/dev/requirements.txt
index 22e72d55543..e7e0a4b4274 100644
--- a/dev/requirements.txt
+++ b/dev/requirements.txt
@@ -35,6 +35,7 @@ numpydoc
jinja2<3.0.0
sphinx<3.1.0
sphinx-plotly-directive
+docutils<0.18.0
# Development scripts
jira