Posted to commits@kyuubi.apache.org by zh...@apache.org on 2022/07/29 04:34:00 UTC
[incubator-kyuubi] branch master updated: [KYUUBI #3160][DOCS] `Dependencies` links in `Connectors for Spark SQL Query Engine` pages jump to wrong place #3160 (#3161)
This is an automated email from the ASF dual-hosted git repository.
zhaomin pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/incubator-kyuubi.git
The following commit(s) were added to refs/heads/master by this push:
new 3729a9982 [KYUUBI #3160][DOCS] `Dependencies` links in `Connectors for Spark SQL Query Engine` pages jump to wrong place #3160 (#3161)
3729a9982 is described below
commit 3729a9982caa3a5eb83d84880ba6a464150afc9e
Author: zhouyifan279 <88...@users.noreply.github.com>
AuthorDate: Fri Jul 29 12:33:55 2022 +0800
[KYUUBI #3160][DOCS] `Dependencies` links in `Connectors for Spark SQL Query Engine` pages jump to wrong place #3160 (#3161)
Signed-off-by: zhaomin1423 <zh...@163.com>
---
docs/connector/flink/iceberg.rst | 4 ++--
docs/connector/spark/delta_lake.rst | 8 ++++----
docs/connector/spark/flink_table_store.rst | 8 ++++----
docs/connector/spark/hudi.rst | 8 ++++----
docs/connector/spark/iceberg.rst | 8 ++++----
docs/connector/spark/tidb.rst | 8 ++++----
docs/connector/spark/tpch.rst | 4 ++--
docs/connector/spark/tpdcs.rst | 4 ++--
8 files changed, 26 insertions(+), 26 deletions(-)
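For context on why this rename was needed: Sphinx ``:ref:`` labels declared with ``.. _name:`` live in a single project-wide namespace, so with eight connector pages each declaring ``.. _dependencies:``, a bare ``:ref:`dependencies``` could resolve to the section on a different page. The hunks below give each page its own prefixed labels; a minimal sketch of the before/after pattern (label names taken from the patch):

```rst
.. Before: a generic label shared by every connector page; the
   reference below may jump to whichever page Sphinx registered.

.. _dependencies:

Dependencies
************

See the :ref:`dependencies` section.

.. After: a page-unique label, so the reference is unambiguous.

.. _spark-iceberg-deps:

Dependencies
************

See the Iceberg :ref:`dependencies<spark-iceberg-deps>` section.
```

The ``:ref:`link text<target>``` form used throughout the patch keeps the visible text ("dependencies") unchanged while binding the link to the page-specific target.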
diff --git a/docs/connector/flink/iceberg.rst b/docs/connector/flink/iceberg.rst
index 3d387a52d..5fc1974a7 100644
--- a/docs/connector/flink/iceberg.rst
+++ b/docs/connector/flink/iceberg.rst
@@ -34,9 +34,9 @@ Iceberg Integration
To enable the integration of kyuubi flink sql engine and Iceberg through Catalog APIs, you need to:
-- Referencing the Iceberg :ref:`dependencies`
+- Referencing the Iceberg :ref:`dependencies<flink-iceberg-deps>`
-.. _dependencies:
+.. _flink-iceberg-deps:
Dependencies
************
diff --git a/docs/connector/spark/delta_lake.rst b/docs/connector/spark/delta_lake.rst
index 3fed1a62d..7a162ad93 100644
--- a/docs/connector/spark/delta_lake.rst
+++ b/docs/connector/spark/delta_lake.rst
@@ -36,10 +36,10 @@ Delta Lake Integration
To enable the integration of kyuubi spark sql engine and delta lake through
Apache Spark Datasource V2 and Catalog APIs, you need to:
-- Referencing the delta lake :ref:`dependencies`
-- Setting the spark extension and catalog :ref:`configurations`
+- Referencing the delta lake :ref:`dependencies<spark-delta-lake-deps>`
+- Setting the spark extension and catalog :ref:`configurations<spark-delta-lake-conf>`
-.. _dependencies:
+.. _spark-delta-lake-deps:
Dependencies
************
@@ -58,7 +58,7 @@ In order to make the delta packages visible for the runtime classpath of engines
.. warning::
Please mind the compatibility of different Delta Lake and Spark versions, which can be confirmed on the page of `delta release notes`_.
-.. _configurations:
+.. _spark-delta-lake-conf:
Configurations
**************
diff --git a/docs/connector/spark/flink_table_store.rst b/docs/connector/spark/flink_table_store.rst
index 3ce03f4ee..2a221c903 100644
--- a/docs/connector/spark/flink_table_store.rst
+++ b/docs/connector/spark/flink_table_store.rst
@@ -34,10 +34,10 @@ Flink Table Store Integration
To enable the integration of kyuubi spark sql engine and Flink Table Store through
Apache Spark Datasource V2 and Catalog APIs, you need to:
-- Referencing the Flink Table Store :ref:`dependencies`
-- Setting the spark extension and catalog :ref:`configurations`
+- Referencing the Flink Table Store :ref:`dependencies<spark-flink-table-store-deps>`
+- Setting the spark extension and catalog :ref:`configurations<spark-flink-table-store-conf>`
-.. _dependencies:
+.. _spark-flink-table-store-deps:
Dependencies
************
@@ -56,7 +56,7 @@ In order to make the Flink Table Store packages visible for the runtime classpat
.. warning::
Please mind the compatibility of different Flink Table Store and Spark versions, which can be confirmed on the page of `Flink Table Store multi engine support`_.
-.. _configurations:
+.. _spark-flink-table-store-conf:
Configurations
**************
diff --git a/docs/connector/spark/hudi.rst b/docs/connector/spark/hudi.rst
index ac3789d0a..3c50720d3 100644
--- a/docs/connector/spark/hudi.rst
+++ b/docs/connector/spark/hudi.rst
@@ -33,10 +33,10 @@ Hudi Integration
To enable the integration of kyuubi spark sql engine and Hudi through
Catalog APIs, you need to:
-- Referencing the Hudi :ref:`dependencies`
-- Setting the Spark extension and catalog :ref:`configurations`
+- Referencing the Hudi :ref:`dependencies<spark-hudi-deps>`
+- Setting the Spark extension and catalog :ref:`configurations<spark-hudi-conf>`
-.. _dependencies:
+.. _spark-hudi-deps:
Dependencies
************
@@ -52,7 +52,7 @@ In order to make the Hudi packages visible for the runtime classpath of engines,
1. Put the Hudi packages into ``$SPARK_HOME/jars`` directly
2. Set ``spark.jars=/path/to/hudi-spark-bundle``
-.. _configurations:
+.. _spark-hudi-conf:
Configurations
**************
diff --git a/docs/connector/spark/iceberg.rst b/docs/connector/spark/iceberg.rst
index 02959c15d..85a7407be 100644
--- a/docs/connector/spark/iceberg.rst
+++ b/docs/connector/spark/iceberg.rst
@@ -35,10 +35,10 @@ Iceberg Integration
To enable the integration of kyuubi spark sql engine and Iceberg through
Apache Spark Datasource V2 and Catalog APIs, you need to:
-- Referencing the Iceberg :ref:`dependencies`
-- Setting the spark extension and catalog :ref:`configurations`
+- Referencing the Iceberg :ref:`dependencies<spark-iceberg-deps>`
+- Setting the spark extension and catalog :ref:`configurations<spark-iceberg-conf>`
-.. _dependencies:
+.. _spark-iceberg-deps:
Dependencies
************
@@ -57,7 +57,7 @@ In order to make the Iceberg packages visible for the runtime classpath of engin
.. warning::
Please mind the compatibility of different Iceberg and Spark versions, which can be confirmed on the page of `Iceberg multi engine support`_.
-.. _configurations:
+.. _spark-iceberg-conf:
Configurations
**************
diff --git a/docs/connector/spark/tidb.rst b/docs/connector/spark/tidb.rst
index bfda33262..922d08dc6 100644
--- a/docs/connector/spark/tidb.rst
+++ b/docs/connector/spark/tidb.rst
@@ -38,10 +38,10 @@ TiDB Integration
To enable the integration of kyuubi spark sql engine and TiDB through
Apache Spark Datasource V2 and Catalog APIs, you need to:
-- Referencing the TiSpark :ref:`dependencies`
-- Setting the spark extension and catalog :ref:`configurations`
+- Referencing the TiSpark :ref:`dependencies<spark-tidb-deps>`
+- Setting the spark extension and catalog :ref:`configurations<spark-tidb-conf>`
-.. _dependencies:
+.. _spark-tidb-deps:
Dependencies
************
@@ -59,7 +59,7 @@ In order to make the TiSpark packages visible for the runtime classpath of engin
.. warning::
Please mind the compatibility of different TiDB, TiSpark and Spark versions, which can be confirmed on the page of `TiSpark Environment setup`_.
-.. _configurations:
+.. _spark-tidb-conf:
Configurations
**************
diff --git a/docs/connector/spark/tpch.rst b/docs/connector/spark/tpch.rst
index 31ae5c92a..29dc91a40 100644
--- a/docs/connector/spark/tpch.rst
+++ b/docs/connector/spark/tpch.rst
@@ -19,12 +19,12 @@ TPC-H
TPC-DS Integration
------------------
-.. _dependencies:
+.. _spark-tpch-deps:
Dependencies
************
-.. _configurations:
+.. _spark-tpch-conf:
Configurations
**************
diff --git a/docs/connector/spark/tpdcs.rst b/docs/connector/spark/tpdcs.rst
index ed110ae8f..58d83b1ab 100644
--- a/docs/connector/spark/tpdcs.rst
+++ b/docs/connector/spark/tpdcs.rst
@@ -19,12 +19,12 @@ TPC-DS
TPC-DS Integration
-------------------
-.. _dependencies:
+.. _spark-tpcds-deps:
Dependencies
************
-.. _configurations:
+.. _spark-tpcds-conf:
Configurations
**************