Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2022/04/25 15:54:21 UTC

[GitHub] [hudi] nsivabalan commented on a diff in pull request #5424: [HUDI-3972] Fixing hoodie.properties/tableConfig for no preCombine field with writes

nsivabalan commented on code in PR #5424:
URL: https://github.com/apache/hudi/pull/5424#discussion_r857784124


##########
hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/hudi/HoodieSparkSqlWriter.scala:
##########
@@ -137,6 +137,9 @@ object HoodieSparkSqlWriter {
       val partitionColumns = HoodieSparkUtils.getPartitionColumns(keyGenerator, toProperties(parameters))
       // Create the table if not present
       if (!tableExists) {
+        val preCombineField = hoodieConfig.getString(PRECOMBINE_FIELD)

Review Comment:
   Will remove this line. It is just a note to the reviewer that at this point the value has already been deduced to "ts" if the user did not explicitly set one.
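   A minimal, self-contained sketch of what "already deduced to 'ts'" means; the names below are stand-ins, not Hudi's actual HoodieConfig / ConfigProperty API. The defaulted config resolves the key against its declared default before the writer reads it, so the lookup never reports "unset":

      // Hypothetical stand-ins; Hudi's real HoodieConfig and ConfigProperty differ in detail.
      object PreCombineDefaultSketch {
        case class ConfigProp(key: String, defaultValue: String)

        class DefaultedConfig(userParams: Map[String, String]) {
          // Simulates hoodieConfig.getString(PRECOMBINE_FIELD): falls back to the declared default.
          def getString(prop: ConfigProp): String =
            userParams.getOrElse(prop.key, prop.defaultValue)
        }

        val PRECOMBINE_FIELD = ConfigProp("hoodie.datasource.write.precombine.field", "ts")

        def main(args: Array[String]): Unit = {
          val cfg = new DefaultedConfig(Map.empty)
          // Prints "ts" even though the user never set a preCombine field.
          println(cfg.getString(PRECOMBINE_FIELD))
        }
      }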



##########
hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/hudi/HoodieSparkSqlWriter.scala:
##########
@@ -151,7 +154,7 @@ object HoodieSparkSqlWriter {
           .setBaseFileFormat(baseFileFormat)
           .setArchiveLogFolder(archiveLogFolder)
           .setPayloadClassName(hoodieConfig.getString(PAYLOAD_CLASS_NAME))
-          .setPreCombineField(hoodieConfig.getStringOrDefault(PRECOMBINE_FIELD, null))
+          .setPreCombineField(optParams.getOrElse(PRECOMBINE_FIELD.key(), null))

Review Comment:
   Without this fix, the tests written in TestMORDatasource fail.
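   A minimal sketch of why the changed lookup matters for the no-preCombine case, again with stand-in names rather than Hudi's actual API: once defaults have been applied, getStringOrDefault can never return null here, so "ts" would be persisted into the table config even when the user supplied no preCombine field, whereas reading the raw write options keeps it unset:

      // Stand-ins, not Hudi's actual HoodieConfig / HoodieTableMetaClient builder API.
      object PreCombineTableConfigSketch {
        val PRECOMBINE_KEY = "hoodie.datasource.write.precombine.field"

        // Models hoodieConfig.getStringOrDefault(PRECOMBINE_FIELD, null) after defaults were applied:
        // the lookup can no longer tell "user set ts" apart from "user set nothing".
        def fromDefaultedConfig(resolvedParams: Map[String, String]): String =
          resolvedParams.getOrElse(PRECOMBINE_KEY, null)

        // Models optParams.getOrElse(PRECOMBINE_FIELD.key(), null): only what the user actually supplied.
        def fromRawOptions(optParams: Map[String, String]): String =
          optParams.getOrElse(PRECOMBINE_KEY, null)

        def main(args: Array[String]): Unit = {
          val userOptions = Map.empty[String, String]               // no preCombine field given
          val resolved    = userOptions + (PRECOMBINE_KEY -> "ts")  // default already deduced upstream

          println(fromDefaultedConfig(resolved))  // "ts"  -> would be written into hoodie.properties
          println(fromRawOptions(userOptions))    // null  -> table config records no preCombine field
        }
      }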


