Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2022/10/25 07:01:48 UTC

[GitHub] [hudi] danny0405 commented on a diff in pull request #7056: [HUDI-5088] Fix bug: Failed to synchronize the Hive metadata of the Flink table

danny0405 commented on code in PR #7056:
URL: https://github.com/apache/hudi/pull/7056#discussion_r1004079499


##########
hudi-flink-datasource/hudi-flink/src/main/java/org/apache/hudi/table/catalog/HiveSchemaUtils.java:
##########
@@ -177,10 +180,19 @@ private static DataType toFlinkPrimitiveType(PrimitiveTypeInfo hiveType) {
 
   /**
    * Create Hive field schemas from Flink table schema including the hoodie metadata fields.
+   *
+   * @param table
    */
-  public static List<FieldSchema> toHiveFieldSchema(TableSchema schema) {
+  public static List<FieldSchema> toHiveFieldSchema(CatalogBaseTable table) {
+    TableSchema schema = table.getSchema();
+    Configuration configuration = Configuration.fromMap(table.getOptions());
+    Boolean changelogEnable = configuration.getBoolean(FlinkOptions.CHANGELOG_ENABLED);
+    Collection<String> hoodieMetaColumns = HoodieRecord.HOODIE_META_COLUMNS;
+    if (changelogEnable) {
+      hoodieMetaColumns = HoodieRecord.HOODIE_META_COLUMNS_WITH_OPERATION;

Review Comment:
   In the current master we do not add the _hoodie_operation field to the Hive table; how was the Hive table created locally?
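   For readers of the archive, a minimal sketch (not the code in the PR) of what the changed method could look like end to end. It assumes the imports already present in HiveSchemaUtils plus java.util.ArrayList, and uses a hypothetical toHiveTypeString helper as a stand-in for the class's Flink-to-Hive type conversion. The Hudi metadata columns are written as Hive string columns, and _hoodie_operation is only included when changelog.enabled is set:

   ```java
   // Sketch only, not the code in the PR. toHiveTypeString(DataType) is a
   // hypothetical stand-in for the Flink-to-Hive type converter.
   public static List<FieldSchema> toHiveFieldSchema(CatalogBaseTable table) {
     TableSchema schema = table.getSchema();
     Configuration configuration = Configuration.fromMap(table.getOptions());
     boolean changelogEnabled = configuration.getBoolean(FlinkOptions.CHANGELOG_ENABLED);

     // With changelog mode enabled, the extra _hoodie_operation metadata column
     // also needs a Hive column; otherwise only the five standard meta columns.
     Collection<String> hoodieMetaColumns = changelogEnabled
         ? HoodieRecord.HOODIE_META_COLUMNS_WITH_OPERATION
         : HoodieRecord.HOODIE_META_COLUMNS;

     List<FieldSchema> columns = new ArrayList<>();
     // Hudi metadata columns are plain strings on the Hive side.
     for (String metaColumn : hoodieMetaColumns) {
       columns.add(new FieldSchema(metaColumn, "string", null));
     }
     // Followed by the user-declared columns from the Flink schema.
     String[] fieldNames = schema.getFieldNames();
     DataType[] fieldTypes = schema.getFieldDataTypes();
     for (int i = 0; i < fieldNames.length; i++) {
       columns.add(new FieldSchema(fieldNames[i], toHiveTypeString(fieldTypes[i]), null));
     }
     return columns;
   }
   ```

   A real implementation would also need to skip computed (non-physical) columns and avoid duplicating meta columns that already appear in the Flink schema; those details are left out of this sketch.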



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@hudi.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org