Posted to commits@flink.apache.org by ku...@apache.org on 2019/07/29 03:18:06 UTC

[flink] branch release-1.9 updated: [FLINK-13352][docs] Using hive connector with hive-1.2.1 needs libfb303 jar

This is an automated email from the ASF dual-hosted git repository.

kurt pushed a commit to branch release-1.9
in repository https://gitbox.apache.org/repos/asf/flink.git


The following commit(s) were added to refs/heads/release-1.9 by this push:
     new 4788e48  [FLINK-13352][docs] Using hive connector with hive-1.2.1 needs libfb303 jar
4788e48 is described below

commit 4788e4807f725e8c8ffeae3ecf18ac1f5ab8b39c
Author: Rui Li <li...@apache.org>
AuthorDate: Mon Jul 29 11:08:38 2019 +0800

    [FLINK-13352][docs] Using hive connector with hive-1.2.1 needs libfb303 jar
    
    This closes #9223
    
    (cherry picked from commit dbcbc09134e9baa1d8e7d3693afc1402e797dd1e)
---
 docs/dev/table/catalog.md | 21 +++++++++------------
 1 file changed, 9 insertions(+), 12 deletions(-)

diff --git a/docs/dev/table/catalog.md b/docs/dev/table/catalog.md
index 4e720b4..0a4dd3d 100644
--- a/docs/dev/table/catalog.md
+++ b/docs/dev/table/catalog.md
@@ -97,25 +97,17 @@ The ultimate goal for integrating Flink with Hive metadata is that:
 
 2. Meta-objects created by `HiveCatalog` can be written back to Hive metastore such that Hive and other Hive-compatible applications can consume them.
 
-## User-configured Catalog
-
-Catalogs are pluggable. Users can develop custom catalogs by implementing the `Catalog` interface, which defines a set of APIs for reading and writing catalog meta-objects such as database, tables, partitions, views, and functions.
-
-
-HiveCatalog
------------
-
-## Supported Hive Versions
+### Supported Hive Versions
 
 Flink's `HiveCatalog` officially supports Hive 2.3.4 and 1.2.1.
 
 The Hive version is explicitly specified as a string, either by passing it to the constructor when creating `HiveCatalog` instances directly in the Table API, or by specifying it in the yaml config file in SQL CLI. The supported Hive version strings are `2.3.4` and `1.2.1`.
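
For illustration only, a minimal Table API sketch of picking the version; it assumes the Flink 1.9 constructor shape `HiveCatalog(name, defaultDatabase, hiveConfDir, hiveVersion)`, and the names and paths are made up:

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.catalog.hive.HiveCatalog;

public class HiveCatalogVersionExample {
    public static void main(String[] args) {
        TableEnvironment tableEnv = TableEnvironment.create(
                EnvironmentSettings.newInstance().useBlinkPlanner().inBatchMode().build());

        // All names and paths below are made up for illustration.
        // The last constructor argument is the explicit Hive version string.
        HiveCatalog hiveCatalog = new HiveCatalog(
                "myhive",          // catalog name
                "default",         // default database
                "/opt/hive-conf",  // directory containing hive-site.xml
                "1.2.1");          // version string: "2.3.4" or "1.2.1"

        tableEnv.registerCatalog("myhive", hiveCatalog);
        tableEnv.useCatalog("myhive");
    }
}
```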
 
-## Case Insensitive to Meta-Object Names
+### Case Insensitive to Meta-Object Names
 
 Note that Hive Metastore stores meta-object names in lower case. Thus, unlike `GenericInMemoryCatalog`, `HiveCatalog` is case-insensitive to meta-object names, and users need to be cautious about that.
 
-## Dependencies
+### Dependencies
 
 To use `HiveCatalog`, users need to include the following dependency jars.
 
@@ -140,6 +132,7 @@ For Hive 1.2.1, users need:
 
 - hive-metastore-1.2.1.jar
 - hive-exec-1.2.1.jar
+- libfb303-0.9.3.jar
 
 
 // Hadoop dependencies
@@ -156,7 +149,7 @@ If you don't have Hive dependencies at hand, they can be found at [mvnrepository.
 Note that users need to make sure their Hive and Hadoop versions are compatible. Otherwise, there may be potential problems; for example, Hive 2.3.4 is compiled against Hadoop 2.7.2, so you may run into problems when using Hive 2.3.4 with Hadoop 2.4.
 
 
-## Data Type Mapping
+### Data Type Mapping
 
 For both Flink and Hive tables, `HiveCatalog` stores table schemas by mapping them to Hive table schemas with Hive data types. Types are dynamically mapped back to Flink types on read.
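
As a rough sketch of what gets mapped, the Hive types noted in the comments below are common mappings assumed for illustration, not taken from this change:

```java
import org.apache.flink.table.api.DataTypes;
import org.apache.flink.table.api.TableSchema;

public class SchemaMappingSketch {
    public static void main(String[] args) {
        // A Flink table schema; comments show the Hive types these fields
        // would plausibly be stored as in the metastore (assumed mappings).
        TableSchema schema = TableSchema.builder()
                .field("user_id", DataTypes.BIGINT())  // Hive BIGINT
                .field("name", DataTypes.STRING())     // Hive STRING
                .field("score", DataTypes.DOUBLE())    // Hive DOUBLE
                .build();

        System.out.println(schema);
    }
}
```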
 
@@ -196,6 +189,10 @@ The following limitations in Hive's data types impact the mapping between Flink
 
 \** maximum length is 65535
 
+## User-configured Catalog
+
+Catalogs are pluggable. Users can develop custom catalogs by implementing the `Catalog` interface, which defines a set of APIs for reading and writing catalog meta-objects such as databases, tables, partitions, views, and functions.
+
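
To give a feel for the shape of such an API, here is a self-contained toy sketch; Flink's actual `Catalog` interface is much larger (covering partitions, views, functions, statistics, and more), so nothing below is the real interface:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Toy stand-in for a pluggable catalog; illustrative only, not Flink's Catalog interface.
interface SimpleCatalog {
    void createDatabase(String name);
    List<String> listDatabases();
    void createTable(String database, String table);
    List<String> listTables(String database);
}

// Minimal in-memory implementation, analogous in spirit to a generic in-memory catalog.
class SimpleInMemoryCatalog implements SimpleCatalog {
    private final Map<String, List<String>> tablesByDatabase = new HashMap<>();

    @Override
    public void createDatabase(String name) {
        tablesByDatabase.putIfAbsent(name, new ArrayList<>());
    }

    @Override
    public List<String> listDatabases() {
        return new ArrayList<>(tablesByDatabase.keySet());
    }

    @Override
    public void createTable(String database, String table) {
        tablesByDatabase.computeIfAbsent(database, k -> new ArrayList<>()).add(table);
    }

    @Override
    public List<String> listTables(String database) {
        return new ArrayList<>(tablesByDatabase.getOrDefault(database, new ArrayList<>()));
    }
}
```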
 
 Catalog Registration
 --------------------