Posted to user-zh@flink.apache.org by Summer <bi...@paat.com> on 2022/08/12 07:53:01 UTC

When creating a Hudi table from the Flink SQL Client, the rt/ro tables are not sync-created in Hive


Versions: Flink 1.13.3, Hudi 0.10.1, Hive 3.1.2


lib:
-rw-r--r-- 1 root root     92313 Nov 12  2021 flink-csv-1.13.3.jar
-rw-r--r-- 1 root root 106535831 Nov 12  2021 flink-dist_2.12-1.13.3.jar
-rw-r--r-- 1 root root    148127 Nov 12  2021 flink-json-1.13.3.jar
-rw-r--r-- 1 root root  43317025 Jun 13 11:49 flink-shaded-hadoop-2-uber-2.8.3-10.0.jar
-rwxr-xr-x 1 root root   7709740 Nov 12  2021 flink-shaded-zookeeper-3.4.14.jar
-rw-r--r-- 1 root root  48816978 Aug 12 13:53 flink-sql-connector-hive-3.1.2_2.11-1.13-SNAPSHOT.jar
-rw-r--r-- 1 root root  35051553 Nov 12  2021 flink-table_2.12-1.13.3.jar
-rw-r--r-- 1 root root  38613339 Nov 12  2021 flink-table-blink_2.12-1.13.3.jar
-rw-r--r-- 1 root root    805845 Aug 12 14:56 hadoop-mapreduce-client-common-3.2.1.jar
-rw-r--r-- 1 root root   1657002 Aug 12 14:56 hadoop-mapreduce-client-core-3.2.1.jar
-rw-r--r-- 1 root root     85900 Aug 12 14:56 hadoop-mapreduce-client-jobclient-3.2.1.jar
-rw-r--r-- 1 root root  38955252 Aug 12 10:55 hudi-flink-bundle_2.11-0.10.1-rc1.jar
-rwxr-xr-x 1 root root     67114 Nov 12  2021 log4j-1.2-api-2.12.1.jar
-rwxr-xr-x 1 root root    276771 Nov 12  2021 log4j-api-2.12.1.jar
-rwxr-xr-x 1 root root   1674433 Nov 12  2021 log4j-core-2.12.1.jar
-rwxr-xr-x 1 root root     23518 Nov 12  2021 log4j-slf4j-impl-2.12.1.jar


Table creation DDL:

CREATE TABLE paat_hudi_flink_test
(
    id          bigint,
    name        string,
    birthday    TIMESTAMP(3),
    ts          TIMESTAMP(3),
    `partition` VARCHAR(20),
    primary key (id) not enforced -- a uuid primary key must be specified
)
    PARTITIONED BY (`partition`)
with (
    'connector'='hudi',
    'path' = 'hdfs://emr-cluster/tmp/hudi/warehouse/paat_ods_hudi.db/'
    , 'hoodie.datasource.write.recordkey.field' = 'id'
    , 'write.precombine.field' = 'ts'
    , 'write.tasks' = '1'
    , 'compaction.tasks' = '1'
    , 'write.rate.limit' = '2000'
    , 'table.type' = 'MERGE_ON_READ'
    , 'compaction.async.enable' = 'true'
    , 'compaction.trigger.strategy' = 'num_commits'
    , 'compaction.max_memory' = '1024'
    , 'changelog.enable' = 'true'
    , 'read.streaming.enable' = 'true'
    , 'read.streaming.check-interval' = '4'
    , 'hive_sync.enable' = 'true'
    , 'hive_sync.mode'= 'hms'
    , 'hive_sync.metastore.uris' = 'thrift://172.**.12.122:9083'
    , 'hive_sync.jdbc_url' = 'jdbc:hive2://172.**.12.122:10000'
    , 'hive_sync.table' = 'paat_hudi_flink_test'
    , 'hive_sync.db' = 'paat_ods_hudi'
    , 'hive_sync.username' = '***'
    , 'hive_sync.password' = '**'
    , 'hive_sync.support_timestamp' = 'true'
);
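One detail that may be relevant: in the Flink/Hudi integration, `CREATE TABLE` only registers the table definition with Flink; Hudi's Hive sync runs as part of the write pipeline, after a commit completes. So a sketch of a smoke test (hypothetical sample values, assuming the DDL above) would be to write a row and let a checkpoint/commit go through before looking for the synced tables:

```sql
-- Hypothetical smoke test: hive sync fires after a successful commit,
-- not at CREATE TABLE time, so write at least one row first.
INSERT INTO paat_hudi_flink_test VALUES
  (1, 'test', TIMESTAMP '2022-08-12 00:00:00',
   TIMESTAMP '2022-08-12 00:00:00', 'par1');
```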
No exceptions were seen at startup.


However, as I recall, hive-sync-related log messages should normally appear when the table is created, and none are showing up now. It feels like the table metadata is not being synced to Hive at all.
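To confirm whether sync ever happened, one way (a sketch, assuming the database name from the DDL above) is to query Hive directly rather than the Flink SQL Client; for a MERGE_ON_READ table, a successful sync creates `<table>_ro` and `<table>_rt`:

```sql
-- Run in Hive (beeline / hive CLI), not the Flink SQL Client.
USE paat_ods_hudi;
SHOW TABLES LIKE 'paat_hudi_flink_test*';
```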


I'm not sure where the setup went wrong. Thanks!