Posted to commits@hudi.apache.org by "nsivabalan (via GitHub)" <gi...@apache.org> on 2023/02/24 18:22:25 UTC

[GitHub] [hudi] nsivabalan commented on a diff in pull request #7987: [HUDI-5514] Record Keys Auto-gen Prototype

nsivabalan commented on code in PR #7987:
URL: https://github.com/apache/hudi/pull/7987#discussion_r1117448184


##########
hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/hudi/RecordKeyAutoGen.scala:
##########
@@ -0,0 +1,93 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements.  See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership.  The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License.  You may obtain a copy of the License at
+ *
+ *      http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing, software
+ * distributed under the License is distributed on an "AS IS" BASIS,
+ * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ * See the License for the specific language governing permissions and
+ * limitations under the License.
+ */
+
+package org.apache.hudi
+
+import org.apache.hudi.common.config.HoodieConfig
+import org.apache.hudi.common.model.{HoodiePayloadProps, HoodieRecord, WriteOperationType}
+import org.apache.hudi.common.table.HoodieTableConfig
+import org.apache.hudi.config.HoodieWriteConfig
+import org.apache.hudi.exception.HoodieException
+import org.apache.hudi.keygen.constant.KeyGeneratorOptions
+import org.apache.spark.sql.{Column, DataFrame}
+import org.apache.spark.sql.catalyst.expressions.AutoRecordKeyGenExpression
+
+import scala.collection.mutable
+import scala.jdk.CollectionConverters.mapAsScalaMapConverter
+
+object RecordKeyAutoGen {
+
+  /**
+   * Set of operations supporting record-key auto-gen (currently only [[WriteOperationType.INSERT]],
+   * [[WriteOperationType.BULK_INSERT]])
+   */
+  private val supportedOperations: Set[String] =
+    Set(WriteOperationType.INSERT, WriteOperationType.BULK_INSERT).map(_.value)
+
+  /**
+   * Set of operations compatible w/ record-key auto-gen (additionally to [[supportedOperations]]
+   * [[WriteOperationType.DELETE]] is a compatible operation)
+   */
+  private val compatibleOperations: Set[String] = supportedOperations ++
+    Set(WriteOperationType.DELETE).map(_.value)
+
+  def tryRecordKeyAutoGen(df: DataFrame, commitInstant: String, config: HoodieConfig): DataFrame = {
+    val shouldAutoGenRecordKeys = config.getBooleanOrDefault(HoodieTableConfig.AUTO_GEN_RECORD_KEYS)
+    val operation = config.getStringOrDefault(DataSourceWriteOptions.OPERATION)
+
+    if (shouldAutoGenRecordKeys && supportedOperations.contains(operation)) {
+      // TODO reorder to keep all meta-fields as first?
+      df.withColumn(HoodieRecord.AUTOGEN_ROW_KEY, new Column(AutoRecordKeyGenExpression(commitInstant)))

Review Comment:
   Why do we need to add a new column? Can't we keep the key in memory (HoodieKey) and write it only to our meta field (_hoodie_record_key)? The goal is to avoid changing the table schema regardless of whether auto-gen is enabled.
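
   To make the suggested alternative concrete, here is a minimal plain-Scala sketch of carrying the generated key on the in-memory record instead of appending a new data column. HoodieKeyLike and autoGenKey are illustrative names for this sketch only, not actual Hudi APIs:

   ```scala
   // Hypothetical sketch -- names are illustrative, not real Hudi classes.
   // Instead of df.withColumn(...), the generated key travels with the
   // record in memory and is only materialized into the _hoodie_record_key
   // meta field by the write handle.
   case class HoodieKeyLike(recordKey: String, partitionPath: String)

   object InMemoryKeyGen {
     // Key format mirrors the PR: "<commitInstant>_<per-row id>"
     def autoGenKey(commitInstant: String, rowId: Long, partitionPath: String): HoodieKeyLike =
       HoodieKeyLike(s"${commitInstant}_$rowId", partitionPath)
   }
   ```

   Under this shape the table schema is identical whether auto-gen is on or off; only the meta-field contents differ.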
   



##########
hudi-spark-datasource/hudi-spark-common/src/main/scala/org/apache/hudi/HoodieSparkSqlWriter.scala:
##########
@@ -537,6 +544,21 @@ object HoodieSparkSqlWriter {
     fullPartitions.distinct
   }
 
+  def handleRecordKeyAutoGen(df: DataFrame, commitInstant: String, config: HoodieConfig): DataFrame = {
+    if (config.getBooleanOrDefault(HoodieTableConfig.AUTO_GEN_RECORD_KEYS)) {
+      val monotonicIdFormat = "#" * 19
+      val rowKeyExpr = Concat(Seq(
+        Literal(s"${commitInstant}_"),
+        FormatNumber(MonotonicallyIncreasingID(), Literal(monotonicIdFormat))
+      ))
+
+      // TODO reorder?
+      df.withColumn(HoodieRecord.AUTOGEN_ROW_KEY, new Column(rowKeyExpr))

Review Comment:
   I don't think we can add a new data field. We already have a meta field for holding the record key (_hoodie_record_key). We should hold the auto-generated record key in memory (HoodieKey.recordKey) and let the writer persist it to the meta fields within the write handle.
   Alternatively, we should drop the newly added data field (HoodieRecord.AUTOGEN_ROW_KEY) later.
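
   For context on the key expression quoted above: Spark's monotonically_increasing_id packs the partition id into the upper 31 bits and the per-partition row index into the lower 33 bits, which makes the ids unique within a write but not dense. A small plain-Scala sketch of the resulting key shape, ignoring the FormatNumber formatting for simplicity (monotonicId and autoRecordKey are illustrative helpers, not Spark or Hudi APIs):

   ```scala
   object AutoKeySketch {
     // Mirrors the documented bit layout of Spark's monotonically_increasing_id:
     // upper 31 bits = partition id, lower 33 bits = row index within the partition.
     def monotonicId(partitionId: Int, rowIndex: Long): Long =
       (partitionId.toLong << 33) | rowIndex

     // Key format used by the PR's expression: "<commitInstant>_<id>"
     def autoRecordKey(commitInstant: String, id: Long): String =
       s"${commitInstant}_$id"
   }
   ```

   Every (partition, row) pair maps to a distinct id within a write, and the commit-instant prefix keeps keys distinct across commits.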
   
   



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscribe@hudi.apache.org

For queries about this service, please contact Infrastructure at:
users@infra.apache.org