Posted to issues@spark.apache.org by "David Toneian (Jira)" <ji...@apache.org> on 2020/02/13 21:27:00 UTC
[jira] [Updated] (SPARK-30823) %PYTHONPATH% not set in python/docs/make2.bat, resulting in failed/wrong documentation builds
[ https://issues.apache.org/jira/browse/SPARK-30823?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
David Toneian updated SPARK-30823:
----------------------------------
Description:
When building the PySpark documentation on Windows, by changing directory to {{python/docs}} and running {{make.bat}} (which runs {{make2.bat}}), the majority of the documentation may not be built if {{pyspark}} is not in the default {{%PYTHONPATH%}}. Sphinx then reports that {{pyspark}} (and possibly dependencies) cannot be imported.
If {{pyspark}} is in the default {{%PYTHONPATH%}}, it is presumably that version of {{pyspark}} – as opposed to the version found above the {{python/docs}} directory – that is picked up when building the documentation, which may result in documentation that does not correspond to the development version one is trying to build.
{{python/docs/Makefile}} avoids this issue by setting
??export PYTHONPATH=$(realpath ..):$(realpath ../lib/py4j-0.10.8.1-src.zip)??
on line 10, but {{make2.bat}} does no such thing. The fix consists of adding
??set PYTHONPATH=..;..\lib\py4j-0.10.8.1-src.zip??
to {{make2.bat}}.
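A minimal sketch of the proposed change (the exact placement within {{make2.bat}} is an assumption; the relative paths resolve against {{python/docs}}, the directory from which {{make.bat}} is invoked, mirroring the {{Makefile}}'s export):
{code:bat}
REM Sketch only: mirrors the PYTHONPATH export in python/docs/Makefile.
REM ".." is the python/ directory containing the pyspark package;
REM the zip adds the bundled py4j dependency. ";" is the Windows
REM path separator (the Makefile uses ":" on Unix).
set PYTHONPATH=..;..\lib\py4j-0.10.8.1-src.zip
{code}
With this set, Sphinx can import {{pyspark}} from the source tree rather than relying on whatever happens to be on the default {{%PYTHONPATH%}}.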
See [GitHub PR #27569|https://github.com/apache/spark/pull/27569].
was:
When building the PySpark documentation on Windows, by changing directory to {{python/docs}} and running {{make.bat}} (which runs {{make2.bat}}), the majority of the documentation may not be built if {{pyspark}} is not in the default {{%PYTHONPATH%}}. Sphinx then reports that {{pyspark}} (and possibly dependencies) cannot be imported.
If {{pyspark}} is in the default {{%PYTHONPATH%}}, it is presumably that version of {{pyspark}} – as opposed to the version found above the {{python/docs}} directory – that is picked up when building the documentation, which may result in documentation that does not correspond to the development version one is trying to build.
{{python/docs/Makefile}} avoids this issue by setting
??export PYTHONPATH=$(realpath ..):$(realpath ../lib/py4j-0.10.8.1-src.zip)??
on line 10, but {{make2.bat}} does no such thing. The fix consists of adding
??set PYTHONPATH=..;..\lib\py4j-0.10.8.1-src.zip??
to {{make2.bat}}.
I will open a GitHub PR shortly.
> %PYTHONPATH% not set in python/docs/make2.bat, resulting in failed/wrong documentation builds
> ---------------------------------------------------------------------------------------------
>
> Key: SPARK-30823
> URL: https://issues.apache.org/jira/browse/SPARK-30823
> Project: Spark
> Issue Type: Bug
> Components: Documentation, PySpark, Windows
> Affects Versions: 2.4.5
> Environment: Tested on Windows 10.
> Reporter: David Toneian
> Priority: Minor
>
> When building the PySpark documentation on Windows, by changing directory to {{python/docs}} and running {{make.bat}} (which runs {{make2.bat}}), the majority of the documentation may not be built if {{pyspark}} is not in the default {{%PYTHONPATH%}}. Sphinx then reports that {{pyspark}} (and possibly dependencies) cannot be imported.
> If {{pyspark}} is in the default {{%PYTHONPATH%}}, it is presumably that version of {{pyspark}} – as opposed to the version found above the {{python/docs}} directory – that is picked up when building the documentation, which may result in documentation that does not correspond to the development version one is trying to build.
> {{python/docs/Makefile}} avoids this issue by setting
> ??export PYTHONPATH=$(realpath ..):$(realpath ../lib/py4j-0.10.8.1-src.zip)??
> on line 10, but {{make2.bat}} does no such thing. The fix consists of adding
> ??set PYTHONPATH=..;..\lib\py4j-0.10.8.1-src.zip??
> to {{make2.bat}}.
> See [GitHub PR #27569|https://github.com/apache/spark/pull/27569].
--
This message was sent by Atlassian Jira
(v8.3.4#803005)
---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org