Posted to reviews@spark.apache.org by GitBox <gi...@apache.org> on 2020/11/23 12:29:07 UTC

[GitHub] [spark] gengliangwang commented on a change in pull request #30469: [SPARK-33479][DOC][FollowUp] DocSearch: Support filtering search results by version

gengliangwang commented on a change in pull request #30469:
URL: https://github.com/apache/spark/pull/30469#discussion_r528668398



##########
File path: docs/_config.yml
##########
@@ -26,15 +26,20 @@ SCALA_VERSION: "2.12.10"
 MESOS_VERSION: 1.0.0
 SPARK_ISSUE_TRACKER_URL: https://issues.apache.org/jira/browse/SPARK
 SPARK_GITHUB_URL: https://github.com/apache/spark
-# Before a new release, we should apply a new `apiKey` for the new Spark documentation
-# on https://docsearch.algolia.com/. Otherwise, after release, the search results are always based
-# on the latest documentation(https://spark.apache.org/docs/latest/) even when visiting the
-# documentation of previous releases.
+# Before a new release, we should:
+#   1. update the `version` array for the new Spark documentation
+#      on https://github.com/algolia/docsearch-configs/blob/master/configs/apache_spark.json.
+#   2. update the value of `facetFilters.version` in `algoliaOptions` on the new release branch.

Review comment:
       An alternative way is to always update the current Spark doc. For example, if we are going to release 3.1.1 and the current doc version is 3.1.0, we can update the version facet of http://spark.apache.org/docs/3.1.0 from `latest` to `3.1.0`. For 3.1.1, we can keep using the version `latest` until the next release.
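   For illustration, a minimal sketch of a docsearch() call with such a version facet (the option names follow the standard DocSearch v2 API; only `algoliaOptions` and `facetFilters.version` are taken from the diff above, everything else here is an assumption):

       // Hypothetical DocSearch initialization; only the facetFilters/version
       // part comes from the change under review, the rest is assumed.
       docsearch({
         apiKey: '<api-key>',               // placeholder, per-site key
         indexName: 'apache_spark',
         inputSelector: '#docsearch-input',
         algoliaOptions: {
           // On the branch currently served as https://spark.apache.org/docs/latest/:
           facetFilters: ['version:latest']
           // After 3.1.1 ships, the 3.1.0 branch would switch to:
           //   facetFilters: ['version:3.1.0']
         }
       });

   With this scheme, only the branch behind /docs/latest/ ever carries the `latest` facet, and each older branch is pinned to its own version string once it is superseded.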
   



