Posted to issues@spark.apache.org by "Hyukjin Kwon (Jira)" <ji...@apache.org> on 2022/06/09 00:54:00 UTC

[jira] [Created] (SPARK-39421) Sphinx build fails with "node class 'meta' is already registered, its visitors will be overridden"

Hyukjin Kwon created SPARK-39421:
------------------------------------

             Summary: Sphinx build fails with "node class 'meta' is already registered, its visitors will be overridden"
                 Key: SPARK-39421
                 URL: https://issues.apache.org/jira/browse/SPARK-39421
             Project: Spark
          Issue Type: Bug
          Components: Documentation
    Affects Versions: 3.4.0
         Environment: {code}
Moving to python/docs directory and building sphinx.
Running Sphinx v3.0.4
WARNING:root:'PYARROW_IGNORE_TIMEZONE' environment variable was not set. It is required to set this environment variable to '1' in both driver and executor sides if you use pyarrow>=2.0.0. pandas-on-Spark will set it for you but it does not work if there is a Spark context already launched.
/__w/spark/spark/python/pyspark/pandas/supported_api_gen.py:101: UserWarning: Warning: Latest version of pandas(>=1.4.0) is required to generate the documentation; however, your version was 1.3.5
  warnings.warn(
Warning, treated as error:
node class 'meta' is already registered, its visitors will be overridden
make: *** [Makefile:35: html] Error 2
                    ------------------------------------------------
      Jekyll 4.2.1   Please append `--trace` to the `build` command 
                     for any additional information or backtrace. 
                    ------------------------------------------------
{code}

The Sphinx build apparently fails with the latest docutils (see also https://issues.apache.org/jira/browse/FLINK-24662). We should pin the docutils version.
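
As a minimal sketch of the fix, assuming the docs build installs its Sphinx dependencies via pip (the exact requirements file to pin is an assumption here, not the actual Spark change): docutils 0.18 ships its own 'meta' node, which collides with the one older Sphinx releases register themselves, so capping docutils below 0.18 should avoid the warning.

{code}
# Hedged sketch, not the exact Spark fix: cap docutils below 0.18,
# the release whose built-in 'meta' node collides with the node that
# Sphinx 3.x registers itself.
pip install 'docutils<0.18.0'

# Equivalent pin as a requirements.txt entry:
# docutils<0.18.0
{code}

Since the build treats warnings as errors (note "Warning, treated as error" in the log above), the registration warning alone is enough to abort it, which is why pinning docutils should unblock the build.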
            Reporter: Hyukjin Kwon

--
This message was sent by Atlassian Jira
(v8.20.7#820007)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org