Posted to commits@hudi.apache.org by GitBox <gi...@apache.org> on 2021/05/19 05:49:40 UTC

[GitHub] [hudi] wangxianghu commented on a change in pull request #2963: [HUDI-1904] Move SchemaProvider to hudi-client-common for code reuse …

wangxianghu commented on a change in pull request #2963:
URL: https://github.com/apache/hudi/pull/2963#discussion_r634930255



##########
File path: hudi-utilities/src/main/java/org/apache/hudi/utilities/schema/HoodieSparkSchemaProvider.java
##########
@@ -18,41 +18,27 @@
 
 package org.apache.hudi.utilities.schema;
 
-import org.apache.hudi.ApiMaturityLevel;
-import org.apache.hudi.PublicAPIClass;
-import org.apache.hudi.PublicAPIMethod;
 import org.apache.hudi.common.config.TypedProperties;
+import org.apache.hudi.schema.SchemaProvider;
 
-import org.apache.avro.Schema;
 import org.apache.spark.api.java.JavaSparkContext;
 
-import java.io.Serializable;
-
 /**
- * Class to provide schema for reading data and also writing into a Hoodie table.
+ * SchemaProvider for spark engine.
  */
-@PublicAPIClass(maturity = ApiMaturityLevel.STABLE)
-public abstract class SchemaProvider implements Serializable {
+public abstract class HoodieSparkSchemaProvider extends SchemaProvider {

Review comment:
       > IMO, can we make SchemaProvider spark-free so that we do not need to copy a FilebasedSchemaProvider for Flink. wdyt?
   
   Good idea, I'll give it a try
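    
    For illustration only (not the actual patch), a spark-free base class in a shared module could look roughly like the sketch below, with the Spark context pushed down into the engine-specific subclass. Everything beyond the names already visible in the diff (package layout, constructor shape, the getTargetSchema default) is an assumption:
    
    // --- sketch: engine-agnostic SchemaProvider (no Spark imports) ---
    package org.apache.hudi.schema;
    
    import org.apache.hudi.common.config.TypedProperties;
    
    import org.apache.avro.Schema;
    
    import java.io.Serializable;
    
    /**
     * Provides the Avro schema used for reading from and writing into a Hoodie table,
     * without referencing any engine-specific context, so Flink and Spark can both
     * reuse implementations such as a file-based schema provider.
     */
    public abstract class SchemaProvider implements Serializable {
    
      protected TypedProperties config;
    
      protected SchemaProvider(TypedProperties props) {
        this.config = props;
      }
    
      public abstract Schema getSourceSchema();
    
      public Schema getTargetSchema() {
        // By default, assume the target schema matches the source schema.
        return getSourceSchema();
      }
    }
    
    // --- sketch: Spark-specific layer, the only place that sees JavaSparkContext ---
    package org.apache.hudi.utilities.schema;
    
    import org.apache.hudi.common.config.TypedProperties;
    import org.apache.hudi.schema.SchemaProvider;
    
    import org.apache.spark.api.java.JavaSparkContext;
    
    public abstract class HoodieSparkSchemaProvider extends SchemaProvider {
    
      protected transient JavaSparkContext jssc;
    
      protected HoodieSparkSchemaProvider(TypedProperties props, JavaSparkContext jssc) {
        super(props);
        this.jssc = jssc;
      }
    }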




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org