Posted to commits@flink.apache.org by ja...@apache.org on 2022/09/20 01:45:13 UTC

[flink] 01/25: [FLINK-29025][docs] add overview page for Hive dialect

This is an automated email from the ASF dual-hosted git repository.

jark pushed a commit to branch release-1.16
in repository https://gitbox.apache.org/repos/asf/flink.git

commit 53194166ee70c71aae55bedf1030f44aaf50ca3d
Author: luoyuxia <lu...@alumni.sjtu.edu.cn>
AuthorDate: Mon Aug 29 14:51:11 2022 +0800

    [FLINK-29025][docs] add overview page for Hive dialect
---
 .../docs/connectors/table/hive/hive_catalog.md     |   2 +-
 .../docs/connectors/table/hive/hive_dialect.md     | 421 --------------------
 .../docs/connectors/table/hive/hive_functions.md   |  10 +-
 .../docs/connectors/table/hive/hive_read_write.md  |   2 +-
 .../docs/connectors/table/hive/overview.md         |   2 +-
 .../docs/dev/table/hiveCompatibility/_index.md     |   4 +-
 .../hiveCompatibility/{ => hiveDialect}/_index.md  |   6 +-
 .../hiveCompatibility/hiveDialect/overview.md      |  93 +++++
 .../docs/connectors/table/hive/hive_catalog.md     |   2 +-
 .../docs/connectors/table/hive/hive_dialect.md     | 434 ---------------------
 .../docs/connectors/table/hive/hive_functions.md   |  10 +-
 .../docs/connectors/table/hive/hive_read_write.md  |   2 +-
 .../content/docs/connectors/table/hive/overview.md |   2 +-
 .../docs/dev/table/hiveCompatibility/_index.md     |   2 +-
 .../table/hiveCompatibility/hiveDialect}/_index.md |   6 +-
 .../hiveCompatibility/hiveDialect/overview.md      |  97 +++++
 16 files changed, 209 insertions(+), 886 deletions(-)

diff --git a/docs/content.zh/docs/connectors/table/hive/hive_catalog.md b/docs/content.zh/docs/connectors/table/hive/hive_catalog.md
index dc2e461fd7f..44a752573e7 100644
--- a/docs/content.zh/docs/connectors/table/hive/hive_catalog.md
+++ b/docs/content.zh/docs/connectors/table/hive/hive_catalog.md
@@ -64,7 +64,7 @@ Generic tables, on the other hand, are specific to Flink. When creating generic
 HMS to persist the metadata. While these tables are visible to Hive, it's unlikely Hive is able to understand
 the metadata. Therefore, using such tables in Hive leads to undefined behavior.
 
-It's recommended to switch to [Hive dialect]({{< ref "docs/connectors/table/hive/hive_dialect" >}}) to create Hive-compatible tables.
+It's recommended to switch to [Hive dialect]({{< ref "docs/dev/table/hiveCompatibility/hiveDialect/overview" >}}) to create Hive-compatible tables.
 If you want to create Hive-compatible tables with default dialect, make sure to set `'connector'='hive'` in your table properties, otherwise
 a table is considered generic by default in `HiveCatalog`. Note that the `connector` property is not required if you use Hive dialect.
 
diff --git a/docs/content.zh/docs/connectors/table/hive/hive_dialect.md b/docs/content.zh/docs/connectors/table/hive/hive_dialect.md
deleted file mode 100644
index d591457ce70..00000000000
--- a/docs/content.zh/docs/connectors/table/hive/hive_dialect.md
+++ /dev/null
@@ -1,421 +0,0 @@
----
-title: "Hive 方言"
-weight: 3
-type: docs
-aliases:
-  - /zh/dev/table/connectors/hive/hive_dialect.html
----
-<!--
-Licensed to the Apache Software Foundation (ASF) under one
-or more contributor license agreements.  See the NOTICE file
-distributed with this work for additional information
-regarding copyright ownership.  The ASF licenses this file
-to you under the Apache License, Version 2.0 (the
-"License"); you may not use this file except in compliance
-with the License.  You may obtain a copy of the License at
-
-  http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing,
-software distributed under the License is distributed on an
-"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-KIND, either express or implied.  See the License for the
-specific language governing permissions and limitations
-under the License.
--->
-
-# Hive Dialect
-
-Starting from 1.11.0, Flink allows users to write SQL statements in Hive syntax when the Hive dialect is used. By providing compatibility with Hive syntax, we aim to improve the interoperability with Hive and reduce the scenarios when users need to switch between Flink and Hive in order to execute different statements.
-
-## Use Hive Dialect
-
-Flink currently supports two SQL dialects: `default` and `hive`. You need to switch to the Hive dialect before you can write in Hive syntax. The following describes how to set the dialect with the SQL Client and the Table API.
-Also notice that you can dynamically switch the dialect for each statement you execute. There's no need to restart a session to use a different dialect.
-
-### SQL Client
-
-The SQL dialect can be specified via the `table.sql-dialect` property. Therefore, you can set the initial dialect to use in the `configuration` section of the yaml file for your SQL Client.
-
-```yaml
-
-execution:
-  type: batch
-  result-mode: table
-
-configuration:
-  table.sql-dialect: hive
-
-```
-
-You can also set the dialect after the SQL Client has launched.
-
-```bash
-
-Flink SQL> set table.sql-dialect=hive; -- to use hive dialect
-[INFO] Session property has been set.
-
-Flink SQL> set table.sql-dialect=default; -- to use default dialect
-[INFO] Session property has been set.
-
-```
-
-### Table API
-
-You can set the dialect for your TableEnvironment with the Table API.
-
-{{< tabs "82a7968d-df12-4db2-83ab-16f09b263935" >}}
-{{< tab "Java" >}}
-```java
-
-EnvironmentSettings settings = EnvironmentSettings.inStreamingMode();
-TableEnvironment tableEnv = TableEnvironment.create(settings);
-// to use hive dialect
-tableEnv.getConfig().setSqlDialect(SqlDialect.HIVE);
-// to use default dialect
-tableEnv.getConfig().setSqlDialect(SqlDialect.DEFAULT);
-
-```
-{{< /tab >}}
-{{< tab "Python" >}}
-```python
-from pyflink.table import *
-
-settings = EnvironmentSettings.in_batch_mode()
-t_env = TableEnvironment.create(settings)
-
-# to use hive dialect
-t_env.get_config().set_sql_dialect(SqlDialect.HIVE)
-# to use default dialect
-t_env.get_config().set_sql_dialect(SqlDialect.DEFAULT)
-
-```
-{{< /tab >}}
-{{< /tabs >}}
-
-## DDL
-
-This section lists the DDL statements supported by the Hive dialect. We mainly focus on the syntax here. You can refer to the [Hive doc](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL)
-for the semantics of each DDL statement.
-
-### CATALOG
-
-#### Show
-
-```sql
-SHOW CURRENT CATALOG;
-```
-
-### DATABASE
-
-#### Show
-
-```sql
-SHOW DATABASES;
-```
-
-#### Create
-
-```sql
-CREATE (DATABASE|SCHEMA) [IF NOT EXISTS] database_name
-  [COMMENT database_comment]
-  [LOCATION fs_path]
-  [WITH DBPROPERTIES (property_name=property_value, ...)];
-```
-
-#### Alter
-
-##### Update Properties
-
-```sql
-ALTER (DATABASE|SCHEMA) database_name SET DBPROPERTIES (property_name=property_value, ...);
-```
-
-##### Update Owner
-
-```sql
-ALTER (DATABASE|SCHEMA) database_name SET OWNER [USER|ROLE] user_or_role;
-```
-
-##### Update Location
-
-```sql
-ALTER (DATABASE|SCHEMA) database_name SET LOCATION fs_path;
-```
-
-#### Drop
-
-```sql
-DROP (DATABASE|SCHEMA) [IF EXISTS] database_name [RESTRICT|CASCADE];
-```
-
-#### Use
-
-```sql
-USE database_name;
-```
-
-### TABLE
-
-#### Show
-
-```sql
-SHOW TABLES;
-```
-
-#### Create
-
-```sql
-CREATE [EXTERNAL] TABLE [IF NOT EXISTS] table_name
-  [(col_name data_type [column_constraint] [COMMENT col_comment], ... [table_constraint])]
-  [COMMENT table_comment]
-  [PARTITIONED BY (col_name data_type [COMMENT col_comment], ...)]
-  [
-    [ROW FORMAT row_format]
-    [STORED AS file_format]
-  ]
-  [LOCATION fs_path]
-  [TBLPROPERTIES (property_name=property_value, ...)]
-
-row_format:
-  : DELIMITED [FIELDS TERMINATED BY char [ESCAPED BY char]] [COLLECTION ITEMS TERMINATED BY char]
-      [MAP KEYS TERMINATED BY char] [LINES TERMINATED BY char]
-      [NULL DEFINED AS char]
-  | SERDE serde_name [WITH SERDEPROPERTIES (property_name=property_value, ...)]
-
-file_format:
-  : SEQUENCEFILE
-  | TEXTFILE
-  | RCFILE
-  | ORC
-  | PARQUET
-  | AVRO
-  | INPUTFORMAT input_format_classname OUTPUTFORMAT output_format_classname
-
-column_constraint:
-  : NOT NULL [[ENABLE|DISABLE] [VALIDATE|NOVALIDATE] [RELY|NORELY]]
-
-table_constraint:
-  : [CONSTRAINT constraint_name] PRIMARY KEY (col_name, ...) [[ENABLE|DISABLE] [VALIDATE|NOVALIDATE] [RELY|NORELY]]
-```
-
-#### Alter
-
-##### Rename
-
-```sql
-ALTER TABLE table_name RENAME TO new_table_name;
-```
-
-##### Update Properties
-
-```sql
-ALTER TABLE table_name SET TBLPROPERTIES (property_name = property_value, property_name = property_value, ... );
-```
-
-##### Update Location
-
-```sql
-ALTER TABLE table_name [PARTITION partition_spec] SET LOCATION fs_path;
-```
-
-The `partition_spec`, if present, needs to be a full spec, i.e. it must have values for all partition columns. When present, the operation is applied to the corresponding partition instead of the table.
-
-##### Update File Format
-
-```sql
-ALTER TABLE table_name [PARTITION partition_spec] SET FILEFORMAT file_format;
-```
-
-The `partition_spec`, if present, needs to be a full spec, i.e. it must have values for all partition columns. When present, the operation is applied to the corresponding partition instead of the table.
-
-##### Update SerDe Properties
-
-```sql
-ALTER TABLE table_name [PARTITION partition_spec] SET SERDE serde_class_name [WITH SERDEPROPERTIES serde_properties];
-
-ALTER TABLE table_name [PARTITION partition_spec] SET SERDEPROPERTIES serde_properties;
-
-serde_properties:
-  : (property_name = property_value, property_name = property_value, ... )
-```
-
-The `partition_spec`, if present, needs to be a full spec, i.e. it must have values for all partition columns. When present, the operation is applied to the corresponding partition instead of the table.
-
-##### Add Partitions
-
-```sql
-ALTER TABLE table_name ADD [IF NOT EXISTS] (PARTITION partition_spec [LOCATION fs_path])+;
-```
-
-##### Drop Partitions
-
-```sql
-ALTER TABLE table_name DROP [IF EXISTS] PARTITION partition_spec[, PARTITION partition_spec, ...];
-```
-
-##### Add/Replace Columns
-
-```sql
-ALTER TABLE table_name
-  ADD|REPLACE COLUMNS (col_name data_type [COMMENT col_comment], ...)
-  [CASCADE|RESTRICT]
-```
-
-##### Change Column
-
-```sql
-ALTER TABLE table_name CHANGE [COLUMN] col_old_name col_new_name column_type
-  [COMMENT col_comment] [FIRST|AFTER column_name] [CASCADE|RESTRICT];
-```
-
-#### Drop
-
-```sql
-DROP TABLE [IF EXISTS] table_name;
-```
-
-### VIEW
-
-#### Create
-
-```sql
-CREATE VIEW [IF NOT EXISTS] view_name [(column_name, ...) ]
-  [COMMENT view_comment]
-  [TBLPROPERTIES (property_name = property_value, ...)]
-  AS SELECT ...;
-```
-
-#### Alter
-
-**Note**: Altering a view only works in the Table API; it is not supported via the SQL Client.
-
-##### Rename
-
-```sql
-ALTER VIEW view_name RENAME TO new_view_name;
-```
-
-##### Update Properties
-
-```sql
-ALTER VIEW view_name SET TBLPROPERTIES (property_name = property_value, ... );
-```
-
-##### Update As Select
-
-```sql
-ALTER VIEW view_name AS select_statement;
-```
-
-#### Drop
-
-```sql
-DROP VIEW [IF EXISTS] view_name;
-```
-
-### FUNCTION
-
-#### Show
-
-```sql
-SHOW FUNCTIONS;
-```
-
-#### Create
-
-```sql
-CREATE FUNCTION function_name AS class_name;
-```
-
-#### Drop
-
-```sql
-DROP FUNCTION [IF EXISTS] function_name;
-```
-
-## DML & DQL _`Beta`_
-
-The Hive dialect supports the commonly-used Hive [DML](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DML)
-and [DQL](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Select). The following lists some of the syntax supported by the Hive dialect.
-
-- [SORT/CLUSTER/DISTRIBUTE BY](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+SortBy)
-- [Group By](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+GroupBy)
-- [Join](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Joins)
-- [Union](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Union)
-- [LATERAL VIEW](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+LateralView)
-- [Window Functions](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+WindowingAndAnalytics)
-- [SubQueries](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+SubQueries)
-- [CTE](https://cwiki.apache.org/confluence/display/Hive/Common+Table+Expression)
-- [INSERT INTO dest schema](https://issues.apache.org/jira/browse/HIVE-9481)
-- [Implicit type conversions](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Types#LanguageManualTypes-AllowedImplicitConversions)
-
-In order to have better syntax and semantic compatibility, it's highly recommended to use [HiveModule]({{< ref "docs/connectors/table/hive/hive_functions" >}}#use-hive-built-in-functions-via-hivemodule)
-and place it first in the module list, so that Hive built-in functions can be picked up during function resolution.
-
-The Hive dialect no longer supports [Flink SQL syntax]({{< ref "docs/dev/table/sql/queries/overview" >}}). Please switch to the `default` dialect if you'd like to write in Flink syntax.
-
-The following is an example of using the Hive dialect.
-
-```bash
-Flink SQL> create catalog myhive with ('type' = 'hive', 'hive-conf-dir' = '/opt/hive-conf');
-[INFO] Execute statement succeed.
-
-Flink SQL> use catalog myhive;
-[INFO] Execute statement succeed.
-
-Flink SQL> load module hive;
-[INFO] Execute statement succeed.
-
-Flink SQL> use modules hive,core;
-[INFO] Execute statement succeed.
-
-Flink SQL> set table.sql-dialect=hive;
-[INFO] Session property has been set.
-
-Flink SQL> select explode(array(1,2,3)); -- call hive udtf
-+-----+
-| col |
-+-----+
-|   1 |
-|   2 |
-|   3 |
-+-----+
-3 rows in set
-
-Flink SQL> create table tbl (key int,value string);
-[INFO] Execute statement succeed.
-
-Flink SQL> insert overwrite table tbl values (5,'e'),(1,'a'),(1,'a'),(3,'c'),(2,'b'),(3,'c'),(3,'c'),(4,'d');
-[INFO] Submitting SQL update statement to the cluster...
-[INFO] SQL update statement has been successfully submitted to the cluster:
-
-Flink SQL> select * from tbl cluster by key; -- run cluster by
-2021-04-22 16:13:57,005 INFO  org.apache.hadoop.mapred.FileInputFormat                     [] - Total input paths to process : 1
-+-----+-------+
-| key | value |
-+-----+-------+
-|   1 |     a |
-|   1 |     a |
-|   5 |     e |
-|   2 |     b |
-|   3 |     c |
-|   3 |     c |
-|   3 |     c |
-|   4 |     d |
-+-----+-------+
-8 rows in set
-```
-
-## Notice
-
-The following are some precautions for using the Hive dialect.
-
-- The Hive dialect should only be used to operate on Hive objects, and requires the current Catalog to be a [HiveCatalog]({{< ref "docs/connectors/table/hive/hive_catalog" >}}).
-- The Hive dialect only supports 2-part identifiers like `db.table`; identifiers that carry a catalog name are not supported.
-- While all Hive versions support the same syntax, whether a specific feature is available still depends on the [Hive version]({{< ref "docs/connectors/table/hive/overview" >}}#支持的hive版本) you use. For example, updating the database
- location is only supported in Hive-2.4.0 or later.
-- Use [HiveModule]({{< ref "docs/connectors/table/hive/hive_functions" >}}#use-hive-built-in-functions-via-hivemodule) when executing DML and DQL.
-- Since Flink 1.15, when the Hive dialect throws the following exception, try to swap the jar `flink-table-planner-loader` located in the `lib` directory with the jar `flink-table-planner_2.12` located in the `opt` directory. Please see [FLINK-25128](https://issues.apache.org/jira/browse/FLINK-25128) for details.
-  {{<img alt="error" width="80%" src="/fig/hive_parser_load_exception.png">}}
-  
diff --git a/docs/content.zh/docs/connectors/table/hive/hive_functions.md b/docs/content.zh/docs/connectors/table/hive/hive_functions.md
index c505c7a3050..da540f9399a 100644
--- a/docs/content.zh/docs/connectors/table/hive/hive_functions.md
+++ b/docs/content.zh/docs/connectors/table/hive/hive_functions.md
@@ -61,13 +61,9 @@ version = "2.3.4"
 t_env.load_module(name, HiveModule(version))
 ```
 {{< /tab >}}
-{{< tab "YAML" >}}
-```yaml
-modules:
-   - name: core
-     type: core
-   - name: myhive
-     type: hive
+{{< tab "SQL Client" >}}
+```sql
+LOAD MODULE hive WITH ('hive-version' = '2.3.4');
 ```
 {{< /tab >}}
 {{< /tabs >}}
diff --git a/docs/content.zh/docs/connectors/table/hive/hive_read_write.md b/docs/content.zh/docs/connectors/table/hive/hive_read_write.md
index fb29cf9366f..99d05cbba05 100644
--- a/docs/content.zh/docs/connectors/table/hive/hive_read_write.md
+++ b/docs/content.zh/docs/connectors/table/hive/hive_read_write.md
@@ -507,7 +507,7 @@ INSERT INTO TABLE fact_tz PARTITION (day, hour) select 1, '2022-8-8', '14';
 
 **Note:**
 - The configuration `table.exec.hive.sink.sort-by-dynamic-partition.enable` only works in batch mode.
-- Currently, `DISTRIBUTED BY` and `SORTED BY` are only supported when using the [Hive dialect]({{< ref "docs/connectors/table/hive/hive_dialect" >}}) in Flink batch mode.
+- Currently, `DISTRIBUTED BY` and `SORTED BY` are only supported when using the [Hive dialect]({{< ref "docs/dev/table/hiveCompatibility/hiveDialect/overview" >}}) in Flink batch mode.
 
 ### Auto Gather Statistics
 When writing to a Hive table, Flink will by default automatically gather statistics for the written data and then commit them to the Hive metastore.
diff --git a/docs/content.zh/docs/connectors/table/hive/overview.md b/docs/content.zh/docs/connectors/table/hive/overview.md
index b40967e8945..957cd790fb7 100644
--- a/docs/content.zh/docs/connectors/table/hive/overview.md
+++ b/docs/content.zh/docs/connectors/table/hive/overview.md
@@ -449,7 +449,7 @@ USE CATALOG myhive;
 
 ## DDL
 
-It's recommended to use the [Hive dialect]({{< ref "docs/connectors/table/hive/hive_dialect" >}}) when executing DDL in Flink to operate on Hive metadata such as tables, views, partitions, and functions.
+It's recommended to use the [Hive dialect]({{< ref "docs/dev/table/hiveCompatibility/hiveDialect/overview" >}}) when executing DDL in Flink to operate on Hive metadata such as tables, views, partitions, and functions.
 
 ## DML
 
diff --git a/docs/content.zh/docs/dev/table/hiveCompatibility/_index.md b/docs/content.zh/docs/dev/table/hiveCompatibility/_index.md
index 3dee17410da..fe5bb79705b 100644
--- a/docs/content.zh/docs/dev/table/hiveCompatibility/_index.md
+++ b/docs/content.zh/docs/dev/table/hiveCompatibility/_index.md
@@ -1,7 +1,7 @@
 ---
-title: Hive Compatibility
+title: Hive 兼容性
 bookCollapseSection: true
-weight: 94
+weight: 34
 ---
 <!--
 Licensed to the Apache Software Foundation (ASF) under one
diff --git a/docs/content.zh/docs/dev/table/hiveCompatibility/_index.md b/docs/content.zh/docs/dev/table/hiveCompatibility/hiveDialect/_index.md
similarity index 95%
copy from docs/content.zh/docs/dev/table/hiveCompatibility/_index.md
copy to docs/content.zh/docs/dev/table/hiveCompatibility/hiveDialect/_index.md
index 3dee17410da..3daf759196f 100644
--- a/docs/content.zh/docs/dev/table/hiveCompatibility/_index.md
+++ b/docs/content.zh/docs/dev/table/hiveCompatibility/hiveDialect/_index.md
@@ -1,7 +1,7 @@
 ---
-title: Hive Compatibility
+title: Hive Dialect
 bookCollapseSection: true
-weight: 94
+weight: 35
 ---
 <!--
 Licensed to the Apache Software Foundation (ASF) under one
@@ -11,9 +11,7 @@ regarding copyright ownership.  The ASF licenses this file
 to you under the Apache License, Version 2.0 (the
 "License"); you may not use this file except in compliance
 with the License.  You may obtain a copy of the License at
-
   http://www.apache.org/licenses/LICENSE-2.0
-
 Unless required by applicable law or agreed to in writing,
 software distributed under the License is distributed on an
 "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
diff --git a/docs/content.zh/docs/dev/table/hiveCompatibility/hiveDialect/overview.md b/docs/content.zh/docs/dev/table/hiveCompatibility/hiveDialect/overview.md
new file mode 100644
index 00000000000..127b74940d1
--- /dev/null
+++ b/docs/content.zh/docs/dev/table/hiveCompatibility/hiveDialect/overview.md
@@ -0,0 +1,93 @@
+---
+title: "概览"
+weight: 1
+type: docs
+aliases:
+- /dev/table/hiveCompatibility/hiveDialect/overview
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+  http://www.apache.org/licenses/LICENSE-2.0
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Hive Dialect
+
+Starting from 1.11.0, Flink allows users to write SQL statements in Hive syntax when the Hive dialect is used.
+By providing compatibility with Hive syntax, we aim to improve the interoperability with Hive and reduce the scenarios when users need to switch between Flink and Hive in order to execute different statements.
+
+## Use Hive Dialect
+
+Flink currently supports two SQL dialects: `default` and `hive`. You need to switch to the Hive dialect before you can write in Hive syntax. The following describes how to set the dialect with the SQL Client and the Table API.
+Also notice that you can dynamically switch the dialect for each statement you execute. There's no need to restart a session to use a different dialect.
+
+{{< hint warning >}}
+**Note:**
+
+- To use the Hive dialect, you must first add the Hive-related dependencies. Please refer to [Hive dependencies]({{< ref "docs/connectors/table/hive/overview" >}}#dependencies) for how to add them.
+- Please make sure the current Catalog is a [HiveCatalog]({{< ref "docs/connectors/table/hive/hive_catalog" >}}); otherwise Flink's `default` dialect will be used (see the sketch after this note).
+- In order to have better syntax and semantic compatibility, it is highly recommended to load the [HiveModule]({{< ref "docs/connectors/table/hive/hive_functions" >}}#use-hive-built-in-functions-via-hivemodule)
+  and place it first in the module list, so that Hive built-in functions can be picked up during function resolution.
+  Please refer [here]({{< ref "docs/dev/table/modules" >}}#how-to-load-unload-use-and-list-modules) for how to place the HiveModule first in the module list.
+- The Hive dialect only supports 2-part identifiers like `db.table`; identifiers with a catalog name are not supported.
+- While all Hive versions support the same syntax, whether a specific feature is available still depends on the [Hive version]({{< ref "docs/connectors/table/hive/overview" >}}#支持的hive版本) you use. For example, updating the database
+  location is only supported in Hive-2.4.0 or later.
+{{< /hint >}}
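+
+A minimal sketch (the catalog name and conf dir are examples) of satisfying the HiveCatalog requirement from the SQL Client:
+
+```sql
+-- register a HiveCatalog and make it the current catalog
+CREATE CATALOG myhive WITH ('type' = 'hive', 'hive-conf-dir' = '/opt/hive-conf');
+USE CATALOG myhive;
+```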
+
+### SQL Client
+
+The SQL dialect can be specified via the `table.sql-dialect` property. You can set the dialect after the SQL Client has launched.
+
+```bash
+Flink SQL> SET 'table.sql-dialect' = 'hive'; -- to use hive dialect
+[INFO] Session property has been set.
+
+Flink SQL> SET 'table.sql-dialect' = 'default'; -- to use default dialect
+[INFO] Session property has been set.
+```
+
+{{< hint warning >}}
+**Note:**
+Since Flink 1.15, to use the Hive dialect in the Flink SQL Client you have to swap the jar `flink-table-planner-loader` located in `FLINK_HOME/lib`
+with the jar `flink-table-planner_2.12` located in `FLINK_HOME/opt`. Otherwise, it'll throw the following exception:
+{{< /hint >}}
+{{<img alt="error" width="80%" src="/fig/hive_parser_load_exception.png">}}
+
+### Table API
+
+You can set the dialect for your TableEnvironment with the Table API.
+
+{{< tabs "f19e5e09-c58d-424d-999d-275106d1d5b3" >}}
+{{< tab "Java" >}}
+```java
+EnvironmentSettings settings = EnvironmentSettings.inStreamingMode();
+TableEnvironment tableEnv = TableEnvironment.create(settings);
+// to use hive dialect
+tableEnv.getConfig().setSqlDialect(SqlDialect.HIVE);
+// to use default dialect
+tableEnv.getConfig().setSqlDialect(SqlDialect.DEFAULT);
+```
+{{< /tab >}}
+{{< tab "Python" >}}
+```python
+from pyflink.table import *
+settings = EnvironmentSettings.in_batch_mode()
+t_env = TableEnvironment.create(settings)
+# to use hive dialect
+t_env.get_config().set_sql_dialect(SqlDialect.HIVE)
+# to use default dialect
+t_env.get_config().set_sql_dialect(SqlDialect.DEFAULT)
+```
+{{< /tab >}}
+{{< /tabs >}}
diff --git a/docs/content/docs/connectors/table/hive/hive_catalog.md b/docs/content/docs/connectors/table/hive/hive_catalog.md
index 932e18fcc0d..323928055c7 100644
--- a/docs/content/docs/connectors/table/hive/hive_catalog.md
+++ b/docs/content/docs/connectors/table/hive/hive_catalog.md
@@ -64,7 +64,7 @@ Generic tables, on the other hand, are specific to Flink. When creating generic
 HMS to persist the metadata. While these tables are visible to Hive, it's unlikely Hive is able to understand
 the metadata. Therefore, using such tables in Hive leads to undefined behavior.
 
-It's recommended to switch to [Hive dialect]({{< ref "docs/connectors/table/hive/hive_dialect" >}}) to create Hive-compatible tables.
+It's recommended to switch to [Hive dialect]({{< ref "docs/dev/table/hiveCompatibility/hiveDialect/overview" >}}) to create Hive-compatible tables.
 If you want to create Hive-compatible tables with default dialect, make sure to set `'connector'='hive'` in your table properties, otherwise
 a table is considered generic by default in `HiveCatalog`. Note that the `connector` property is not required if you use Hive dialect.
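 
 For example, a minimal sketch (the table name and columns are hypothetical) of creating a Hive-compatible table from the default dialect:
 
 ```sql
 -- inside a HiveCatalog, 'connector' = 'hive' marks the table as Hive-compatible
 CREATE TABLE hive_compatible_tbl (
   key INT,
   value STRING
 ) WITH ('connector' = 'hive');
 ```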
 
diff --git a/docs/content/docs/connectors/table/hive/hive_dialect.md b/docs/content/docs/connectors/table/hive/hive_dialect.md
deleted file mode 100644
index f8d2e675cbb..00000000000
--- a/docs/content/docs/connectors/table/hive/hive_dialect.md
+++ /dev/null
@@ -1,434 +0,0 @@
----
-title: "Hive Dialect"
-weight: 3
-type: docs
-aliases:
-  - /dev/table/connectors/hive/hive_dialect.html
----
-<!--
-Licensed to the Apache Software Foundation (ASF) under one
-or more contributor license agreements.  See the NOTICE file
-distributed with this work for additional information
-regarding copyright ownership.  The ASF licenses this file
-to you under the Apache License, Version 2.0 (the
-"License"); you may not use this file except in compliance
-with the License.  You may obtain a copy of the License at
-
-  http://www.apache.org/licenses/LICENSE-2.0
-
-Unless required by applicable law or agreed to in writing,
-software distributed under the License is distributed on an
-"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
-KIND, either express or implied.  See the License for the
-specific language governing permissions and limitations
-under the License.
--->
-
-# Hive Dialect
-
-Flink allows users to write SQL statements in Hive syntax when Hive dialect
-is used. By providing compatibility with Hive syntax, we aim to improve the interoperability with
-Hive and reduce the scenarios when users need to switch between Flink and Hive in order to execute
-different statements.
-
-## Use Hive Dialect
-
-Flink currently supports two SQL dialects: `default` and `hive`. You need to switch to Hive dialect
-before you can write in Hive syntax. The following describes how to set dialect with
-SQL Client and Table API. Also notice that you can dynamically switch dialect for each
-statement you execute. There's no need to restart a session to use a different dialect.
-
-### SQL Client
-
-SQL dialect can be specified via the `table.sql-dialect` property. Therefore you can set the initial dialect to use in
-the `configuration` section of the yaml file for your SQL Client.
-
-```yaml
-
-execution:
-  type: batch
-  result-mode: table
-
-configuration:
-  table.sql-dialect: hive
-
-```
-
-You can also set the dialect after the SQL Client has launched.
-
-```bash
-
-Flink SQL> SET 'table.sql-dialect' = 'hive'; -- to use hive dialect
-[INFO] Session property has been set.
-
-Flink SQL> SET 'table.sql-dialect' = 'default'; -- to use default dialect
-[INFO] Session property has been set.
-
-```
-
-### Table API
-
-You can set the dialect for your TableEnvironment with the Table API.
-
-{{< tabs "f19e5e09-c58d-424d-999d-275106d1d5b3" >}}
-{{< tab "Java" >}}
-```java
-
-EnvironmentSettings settings = EnvironmentSettings.inStreamingMode();
-TableEnvironment tableEnv = TableEnvironment.create(settings);
-// to use hive dialect
-tableEnv.getConfig().setSqlDialect(SqlDialect.HIVE);
-// to use default dialect
-tableEnv.getConfig().setSqlDialect(SqlDialect.DEFAULT);
-
-```
-{{< /tab >}}
-{{< tab "Python" >}}
-```python
-from pyflink.table import *
-
-settings = EnvironmentSettings.in_batch_mode()
-t_env = TableEnvironment.create(settings)
-
-# to use hive dialect
-t_env.get_config().set_sql_dialect(SqlDialect.HIVE)
-# to use default dialect
-t_env.get_config().set_sql_dialect(SqlDialect.DEFAULT)
-
-```
-{{< /tab >}}
-{{< /tabs >}}
-
-## DDL
-
-This section lists the supported DDLs with the Hive dialect. We'll mainly focus on the syntax
-here. You can refer to [Hive doc](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DDL)
-for the semantics of each DDL statement.
-
-### CATALOG
-
-#### Show
-
-```sql
-SHOW CURRENT CATALOG;
-```
-
-### DATABASE
-
-#### Show
-
-```sql
-SHOW DATABASES;
-SHOW CURRENT DATABASE;
-```
-
-#### Create
-
-```sql
-CREATE (DATABASE|SCHEMA) [IF NOT EXISTS] database_name
-  [COMMENT database_comment]
-  [LOCATION fs_path]
-  [WITH DBPROPERTIES (property_name=property_value, ...)];
-```
-
-#### Alter
-
-##### Update Properties
-
-```sql
-ALTER (DATABASE|SCHEMA) database_name SET DBPROPERTIES (property_name=property_value, ...);
-```
-
-##### Update Owner
-
-```sql
-ALTER (DATABASE|SCHEMA) database_name SET OWNER [USER|ROLE] user_or_role;
-```
-
-##### Update Location
-
-```sql
-ALTER (DATABASE|SCHEMA) database_name SET LOCATION fs_path;
-```
-
-#### Drop
-
-```sql
-DROP (DATABASE|SCHEMA) [IF EXISTS] database_name [RESTRICT|CASCADE];
-```
-
-#### Use
-
-```sql
-USE database_name;
-```
-
-### TABLE
-
-#### Show
-
-```sql
-SHOW TABLES;
-```
-
-#### Create
-
-```sql
-CREATE [EXTERNAL] TABLE [IF NOT EXISTS] table_name
-  [(col_name data_type [column_constraint] [COMMENT col_comment], ... [table_constraint])]
-  [COMMENT table_comment]
-  [PARTITIONED BY (col_name data_type [COMMENT col_comment], ...)]
-  [
-    [ROW FORMAT row_format]
-    [STORED AS file_format]
-  ]
-  [LOCATION fs_path]
-  [TBLPROPERTIES (property_name=property_value, ...)]
-
-row_format:
-  : DELIMITED [FIELDS TERMINATED BY char [ESCAPED BY char]] [COLLECTION ITEMS TERMINATED BY char]
-      [MAP KEYS TERMINATED BY char] [LINES TERMINATED BY char]
-      [NULL DEFINED AS char]
-  | SERDE serde_name [WITH SERDEPROPERTIES (property_name=property_value, ...)]
-
-file_format:
-  : SEQUENCEFILE
-  | TEXTFILE
-  | RCFILE
-  | ORC
-  | PARQUET
-  | AVRO
-  | INPUTFORMAT input_format_classname OUTPUTFORMAT output_format_classname
-
-column_constraint:
-  : NOT NULL [[ENABLE|DISABLE] [VALIDATE|NOVALIDATE] [RELY|NORELY]]
-
-table_constraint:
-  : [CONSTRAINT constraint_name] PRIMARY KEY (col_name, ...) [[ENABLE|DISABLE] [VALIDATE|NOVALIDATE] [RELY|NORELY]]
-```
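-
-As a sketch of the grammar above (table and column names are hypothetical), a partitioned, comma-delimited text table could be declared as:
-
-```sql
-CREATE TABLE IF NOT EXISTS page_views (
-  user_id INT,
-  url STRING COMMENT 'viewed url'
-)
-PARTITIONED BY (dt STRING)
-ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
-STORED AS TEXTFILE;
-```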
-
-#### Alter
-
-##### Rename
-
-```sql
-ALTER TABLE table_name RENAME TO new_table_name;
-```
-
-##### Update Properties
-
-```sql
-ALTER TABLE table_name SET TBLPROPERTIES (property_name = property_value, property_name = property_value, ... );
-```
-
-##### Update Location
-
-```sql
-ALTER TABLE table_name [PARTITION partition_spec] SET LOCATION fs_path;
-```
-
-The `partition_spec`, if present, needs to be a full spec, i.e. it must have values for all partition columns. When it's
-present, the operation will be applied to the corresponding partition instead of the table.
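-
-For instance (hypothetical names and path), moving a single partition of a table partitioned by `dt`:
-
-```sql
-ALTER TABLE page_views PARTITION (dt='2022-08-01') SET LOCATION '/warehouse/page_views/dt=2022-08-01';
-```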
-
-##### Update File Format
-
-```sql
-ALTER TABLE table_name [PARTITION partition_spec] SET FILEFORMAT file_format;
-```
-
-The `partition_spec`, if present, needs to be a full spec, i.e. it must have values for all partition columns. When it's
-present, the operation will be applied to the corresponding partition instead of the table.
-
-##### Update SerDe Properties
-
-```sql
-ALTER TABLE table_name [PARTITION partition_spec] SET SERDE serde_class_name [WITH SERDEPROPERTIES serde_properties];
-
-ALTER TABLE table_name [PARTITION partition_spec] SET SERDEPROPERTIES serde_properties;
-
-serde_properties:
-  : (property_name = property_value, property_name = property_value, ... )
-```
-
-The `partition_spec`, if present, needs to be a full spec, i.e. it must have values for all partition columns. When it's
-present, the operation will be applied to the corresponding partition instead of the table.
-
-##### Add Partitions
-
-```sql
-ALTER TABLE table_name ADD [IF NOT EXISTS] (PARTITION partition_spec [LOCATION fs_path])+;
-```
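-
-For instance (hypothetical names), adding two daily partitions in one statement:
-
-```sql
-ALTER TABLE page_views ADD IF NOT EXISTS
-  PARTITION (dt='2022-08-01')
-  PARTITION (dt='2022-08-02');
-```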
-
-##### Drop Partitions
-
-```sql
-ALTER TABLE table_name DROP [IF EXISTS] PARTITION partition_spec[, PARTITION partition_spec, ...];
-```
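-
-And the matching cleanup (hypothetical names), dropping both partitions:
-
-```sql
-ALTER TABLE page_views DROP IF EXISTS PARTITION (dt='2022-08-01'), PARTITION (dt='2022-08-02');
-```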
-
-##### Add/Replace Columns
-
-```sql
-ALTER TABLE table_name
-  ADD|REPLACE COLUMNS (col_name data_type [COMMENT col_comment], ...)
-  [CASCADE|RESTRICT]
-```
-
-##### Change Column
-
-```sql
-ALTER TABLE table_name CHANGE [COLUMN] col_old_name col_new_name column_type
-  [COMMENT col_comment] [FIRST|AFTER column_name] [CASCADE|RESTRICT];
-```
-
-#### Drop
-
-```sql
-DROP TABLE [IF EXISTS] table_name;
-```
-
-### VIEW
-
-#### Create
-
-```sql
-CREATE VIEW [IF NOT EXISTS] view_name [(column_name, ...) ]
-  [COMMENT view_comment]
-  [TBLPROPERTIES (property_name = property_value, ...)]
-  AS SELECT ...;
-```
-
-#### Alter
-
-##### Rename
-
-```sql
-ALTER VIEW view_name RENAME TO new_view_name;
-```
-
-##### Update Properties
-
-```sql
-ALTER VIEW view_name SET TBLPROPERTIES (property_name = property_value, ... );
-```
-
-##### Update As Select
-
-```sql
-ALTER VIEW view_name AS select_statement;
-```
-
-#### Drop
-
-```sql
-DROP VIEW [IF EXISTS] view_name;
-```
-
-### FUNCTION
-
-#### Show
-
-```sql
-SHOW FUNCTIONS;
-```
-
-#### Create
-
-```sql
-CREATE FUNCTION function_name AS class_name;
-```
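-
-For example (the function and class names are hypothetical), registering a Hive UDF by its implementation class:
-
-```sql
-CREATE FUNCTION my_lower AS 'com.example.hive.udf.MyLower';
-```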
-
-#### Drop
-
-```sql
-DROP FUNCTION [IF EXISTS] function_name;
-```
-
-## DML & DQL _`Beta`_
-
-Hive dialect supports a commonly-used subset of Hive's [DML](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+DML)
-and [DQL](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Select). The following lists some examples of
-HiveQL supported by the Hive dialect.
-
-- [SORT/CLUSTER/DISTRIBUTE BY](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+SortBy)
-- [Group By](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+GroupBy)
-- [Join](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Joins)
-- [Union](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Union)
-- [LATERAL VIEW](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+LateralView)
-- [Window Functions](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+WindowingAndAnalytics)
-- [SubQueries](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+SubQueries)
-- [CTE](https://cwiki.apache.org/confluence/display/Hive/Common+Table+Expression)
-- [INSERT INTO dest schema](https://issues.apache.org/jira/browse/HIVE-9481)
-- [Implicit type conversions](https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Types#LanguageManualTypes-AllowedImplicitConversions)
-
-In order to have better syntax and semantic compatibility, it's highly recommended to use [HiveModule]({{< ref "docs/connectors/table/hive/hive_functions" >}}#use-hive-built-in-functions-via-hivemodule)
-and place it first in the module list, so that Hive built-in functions can be picked up during function resolution.
-
-Hive dialect no longer supports [Flink SQL queries]({{< ref "docs/dev/table/sql/queries/overview" >}}). Please switch to the `default`
-dialect if you'd like to write in Flink syntax.
-
-The following is an example of using the Hive dialect to run some queries.
-
-```bash
-Flink SQL> create catalog myhive with ('type' = 'hive', 'hive-conf-dir' = '/opt/hive-conf');
-[INFO] Execute statement succeed.
-
-Flink SQL> use catalog myhive;
-[INFO] Execute statement succeed.
-
-Flink SQL> load module hive;
-[INFO] Execute statement succeed.
-
-Flink SQL> use modules hive,core;
-[INFO] Execute statement succeed.
-
-Flink SQL> set table.sql-dialect=hive;
-[INFO] Session property has been set.
-
-Flink SQL> select explode(array(1,2,3)); -- call hive udtf
-+-----+
-| col |
-+-----+
-|   1 |
-|   2 |
-|   3 |
-+-----+
-3 rows in set
-
-Flink SQL> create table tbl (key int,value string);
-[INFO] Execute statement succeed.
-
-Flink SQL> insert overwrite table tbl values (5,'e'),(1,'a'),(1,'a'),(3,'c'),(2,'b'),(3,'c'),(3,'c'),(4,'d');
-[INFO] Submitting SQL update statement to the cluster...
-[INFO] SQL update statement has been successfully submitted to the cluster:
-
-Flink SQL> select * from tbl cluster by key; -- run cluster by
-2021-04-22 16:13:57,005 INFO  org.apache.hadoop.mapred.FileInputFormat                     [] - Total input paths to process : 1
-+-----+-------+
-| key | value |
-+-----+-------+
-|   1 |     a |
-|   1 |     a |
-|   5 |     e |
-|   2 |     b |
-|   3 |     c |
-|   3 |     c |
-|   3 |     c |
-|   4 |     d |
-+-----+-------+
-8 rows in set
-```
-
-## Notice
-
-The following are some precautions for using the Hive dialect.
-
-- Hive dialect should only be used to process Hive meta objects, and requires the current catalog to be a
-[HiveCatalog]({{< ref "docs/connectors/table/hive/hive_catalog" >}}).
-- Hive dialect only supports 2-part identifiers, so you can't specify catalog for an identifier.
-- While all Hive versions support the same syntax, whether a specific feature is available still depends on the
-[Hive version]({{< ref "docs/connectors/table/hive/overview" >}}#supported-hive-versions) you use. For example, updating database
-location is only supported in Hive-2.4.0 or later.
-- Use [HiveModule]({{< ref "docs/connectors/table/hive/hive_functions" >}}#use-hive-built-in-functions-via-hivemodule)
-to run DML and DQL.
-- Since Flink 1.15, you need to swap `flink-table-planner-loader` located in `/lib` with `flink-table-planner_2.12` located in `/opt` to avoid the following exception. Please see [FLINK-25128](https://issues.apache.org/jira/browse/FLINK-25128) for more details.
-  {{<img alt="error" width="80%" src="/fig/hive_parser_load_exception.png">}}
diff --git a/docs/content/docs/connectors/table/hive/hive_functions.md b/docs/content/docs/connectors/table/hive/hive_functions.md
index a9dcfe61557..e57d27f1804 100644
--- a/docs/content/docs/connectors/table/hive/hive_functions.md
+++ b/docs/content/docs/connectors/table/hive/hive_functions.md
@@ -61,13 +61,9 @@ version = "2.3.4"
 t_env.load_module(name, HiveModule(version))
 ```
 {{< /tab >}}
-{{< tab "YAML" >}}
-```yaml
-modules:
-   - name: core
-     type: core
-   - name: myhive
-     type: hive
+{{< tab "SQL Client" >}}
+```sql
+LOAD MODULE hive WITH ('hive-version' = '2.3.4');
 ```
 {{< /tab >}}
 {{< /tabs >}}
diff --git a/docs/content/docs/connectors/table/hive/hive_read_write.md b/docs/content/docs/connectors/table/hive/hive_read_write.md
index 7490e2bcbac..8577c8b8f0a 100644
--- a/docs/content/docs/connectors/table/hive/hive_read_write.md
+++ b/docs/content/docs/connectors/table/hive/hive_read_write.md
@@ -534,7 +534,7 @@ Also, you can manually add `SORTED BY <partition_field>` in your SQL statement t
 
 **NOTE:** 
 - The configuration `table.exec.hive.sink.sort-by-dynamic-partition.enable` only works in Flink `BATCH` mode.
-- Currently, `DISTRIBUTED BY` and `SORTED BY` are only supported when using [Hive dialect]({{< ref "docs/connectors/table/hive/hive_dialect" >}}) in Flink `BATCH` mode.
+- Currently, `DISTRIBUTED BY` and `SORTED BY` are only supported when using [Hive dialect]({{< ref "docs/dev/table/hiveCompatibility/hiveDialect/overview" >}}) in Flink `BATCH` mode.
 
 ### Auto Gather Statistics
 By default, Flink will gather statistics automatically and then commit them to the Hive metastore when writing to a Hive table.
diff --git a/docs/content/docs/connectors/table/hive/overview.md b/docs/content/docs/connectors/table/hive/overview.md
index 687d3afb2eb..6dc9f5117b0 100644
--- a/docs/content/docs/connectors/table/hive/overview.md
+++ b/docs/content/docs/connectors/table/hive/overview.md
@@ -454,7 +454,7 @@ Below are the options supported when creating a `HiveCatalog` instance with YAML
 
 ## DDL
 
-It's recommended to use [Hive dialect]({{< ref "docs/connectors/table/hive/hive_dialect" >}}) to execute DDLs to create
+It's recommended to use [Hive dialect]({{< ref "docs/dev/table/hiveCompatibility/hiveDialect/overview" >}}) to execute DDLs to create
 Hive tables, views, partitions, functions within Flink.
 
 ## DML
diff --git a/docs/content/docs/dev/table/hiveCompatibility/_index.md b/docs/content/docs/dev/table/hiveCompatibility/_index.md
index 3dee17410da..75ce8032c3d 100644
--- a/docs/content/docs/dev/table/hiveCompatibility/_index.md
+++ b/docs/content/docs/dev/table/hiveCompatibility/_index.md
@@ -1,7 +1,7 @@
 ---
 title: Hive Compatibility
 bookCollapseSection: true
-weight: 94
+weight: 34
 ---
 <!--
 Licensed to the Apache Software Foundation (ASF) under one
diff --git a/docs/content.zh/docs/dev/table/hiveCompatibility/_index.md b/docs/content/docs/dev/table/hiveCompatibility/hiveDialect/_index.md
similarity index 95%
copy from docs/content.zh/docs/dev/table/hiveCompatibility/_index.md
copy to docs/content/docs/dev/table/hiveCompatibility/hiveDialect/_index.md
index 3dee17410da..5eaefdb93b0 100644
--- a/docs/content.zh/docs/dev/table/hiveCompatibility/_index.md
+++ b/docs/content/docs/dev/table/hiveCompatibility/hiveDialect/_index.md
@@ -1,7 +1,7 @@
 ---
-title: Hive Compatibility
+title: Hive Dialect
 bookCollapseSection: true
-weight: 94
+weight: 35
 ---
 <!--
 Licensed to the Apache Software Foundation (ASF) under one
@@ -11,9 +11,7 @@ regarding copyright ownership.  The ASF licenses this file
 to you under the Apache License, Version 2.0 (the
 "License"); you may not use this file except in compliance
 with the License.  You may obtain a copy of the License at
-
   http://www.apache.org/licenses/LICENSE-2.0
-
 Unless required by applicable law or agreed to in writing,
 software distributed under the License is distributed on an
 "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
diff --git a/docs/content/docs/dev/table/hiveCompatibility/hiveDialect/overview.md b/docs/content/docs/dev/table/hiveCompatibility/hiveDialect/overview.md
new file mode 100644
index 00000000000..c652bef92db
--- /dev/null
+++ b/docs/content/docs/dev/table/hiveCompatibility/hiveDialect/overview.md
@@ -0,0 +1,97 @@
+---
+title: "Overview"
+weight: 1
+type: docs
+aliases:
+- /dev/table/hiveCompatibility/hiveDialect/overview
+---
+<!--
+Licensed to the Apache Software Foundation (ASF) under one
+or more contributor license agreements.  See the NOTICE file
+distributed with this work for additional information
+regarding copyright ownership.  The ASF licenses this file
+to you under the Apache License, Version 2.0 (the
+"License"); you may not use this file except in compliance
+with the License.  You may obtain a copy of the License at
+  http://www.apache.org/licenses/LICENSE-2.0
+Unless required by applicable law or agreed to in writing,
+software distributed under the License is distributed on an
+"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+KIND, either express or implied.  See the License for the
+specific language governing permissions and limitations
+under the License.
+-->
+
+# Hive Dialect
+
+Flink allows users to write SQL statements in Hive syntax when Hive dialect is used.
+By providing compatibility with Hive syntax, we aim to improve the interoperability with Hive and reduce the scenarios when users need to switch between Flink and Hive in order to execute different statements.
+
+## Use Hive Dialect
+
+Flink currently supports two SQL dialects: `default` and `hive`. You need to switch to Hive dialect
+before you can write in Hive syntax. The following describes how to set dialect with
+SQL Client and Table API. Also notice that you can dynamically switch dialect for each
+statement you execute. There's no need to restart a session to use a different dialect.
+
+{{< hint warning >}}
+**Note:**
+
+- To use Hive dialect, you have to add dependencies related to Hive. Please refer to [Hive dependencies]({{< ref "docs/connectors/table/hive/overview" >}}#dependencies) for how to add the dependencies.
+- Please make sure the current catalog is [HiveCatalog]({{< ref "docs/connectors/table/hive/hive_catalog" >}}). Otherwise, it will fall back to Flink's `default` dialect.
+- In order to have better syntax and semantic compatibility, it’s highly recommended to load [HiveModule]({{< ref "docs/connectors/table/hive/hive_functions" >}}#use-hive-built-in-functions-via-hivemodule) and
+  place it first in the module list, so that Hive built-in functions can be picked up during function resolution.
+  Please refer [here]({{< ref "docs/dev/table/modules" >}}#how-to-load-unload-use-and-list-modules) for how to change the resolution order; see the sketch after this note.
+- Hive dialect only supports 2-part identifiers, so you can't specify catalog for an identifier.
+- While all Hive versions support the same syntax, whether a specific feature is available still depends on the
+  [Hive version]({{< ref "docs/connectors/table/hive/overview" >}}#supported-hive-versions) you use. For example, updating database
+  location is only supported in Hive-2.4.0 or later.
+{{< /hint >}}
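+
+A minimal sketch of that module setup in the SQL Client (assuming the Hive dependencies are already on the classpath):
+
+```sql
+-- load the Hive module and resolve functions against it before the core module
+LOAD MODULE hive;
+USE MODULES hive, core;
+```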
+
+### SQL Client
+
+The SQL dialect can be specified via the `table.sql-dialect` property.
+You can set the dialect after the SQL Client has launched.
+
+```bash
+Flink SQL> SET 'table.sql-dialect' = 'hive'; -- to use hive dialect
+[INFO] Session property has been set.
+
+Flink SQL> SET 'table.sql-dialect' = 'default'; -- to use default dialect
+[INFO] Session property has been set.
+```
+
+{{< hint warning >}}
+**Note:**
+Since Flink 1.15, to use the Hive dialect in the Flink SQL Client you have to swap the jar `flink-table-planner-loader` located in `FLINK_HOME/lib`
+with the jar `flink-table-planner_2.12` located in `FLINK_HOME/opt`. Otherwise, it'll throw the following exception:
+{{< /hint >}}
+{{<img alt="error" width="80%" src="/fig/hive_parser_load_exception.png">}}
+
+### Table API
+
+You can set the dialect for your TableEnvironment with the Table API.
+
+{{< tabs "f19e5e09-c58d-424d-999d-275106d1d5b3" >}}
+{{< tab "Java" >}}
+```java
+EnvironmentSettings settings = EnvironmentSettings.inStreamingMode();
+TableEnvironment tableEnv = TableEnvironment.create(settings);
+// to use hive dialect
+tableEnv.getConfig().setSqlDialect(SqlDialect.HIVE);
+// to use default dialect
+tableEnv.getConfig().setSqlDialect(SqlDialect.DEFAULT);
+```
+{{< /tab >}}
+{{< tab "Python" >}}
+```python
+from pyflink.table import *
+settings = EnvironmentSettings.in_batch_mode()
+t_env = TableEnvironment.create(settings)
+# to use hive dialect
+t_env.get_config().set_sql_dialect(SqlDialect.HIVE)
+# to use default dialect
+t_env.get_config().set_sql_dialect(SqlDialect.DEFAULT)
+```
+{{< /tab >}}
+{{< /tabs >}}