Posted to commits@hudi.apache.org by "Ethan Guo (Jira)" <ji...@apache.org> on 2022/04/18 21:53:00 UTC

[jira] [Updated] (HUDI-2375) Create common SchemaProvider and RecordPayloads for spark, flink etc.

     [ https://issues.apache.org/jira/browse/HUDI-2375?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Ethan Guo updated HUDI-2375:
----------------------------
    Fix Version/s: 0.12.0

> Create common SchemaProvider and RecordPayloads for spark, flink etc.
> ---------------------------------------------------------------------
>
>                 Key: HUDI-2375
>                 URL: https://issues.apache.org/jira/browse/HUDI-2375
>             Project: Apache Hudi
>          Issue Type: Improvement
>            Reporter: Rajesh Mahindra
>            Priority: Major
>             Fix For: 0.12.0
>
>
> Create common SchemaProvider and RecordPayloads for spark, flink etc.
> - Currently the class org.apache.hudi.utilities.schema.SchemaProvider takes a JavaSparkContext as input and is specific to the Spark engine, so we created a separate SchemaProvider for Flink. Kafka Connect can use neither, since it is neither Spark nor Flink. Implement a common class that uses HoodieEngineContext instead (see the sketch below).
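
The description above asks for a schema provider that does not depend on JavaSparkContext. Below is a minimal sketch of what such an engine-agnostic base class could look like, reusing the getSourceSchema()/getTargetSchema() contract of the existing org.apache.hudi.utilities.schema.SchemaProvider; the class name CommonSchemaProvider and the exact constructor shape are illustrative assumptions, not the API that was eventually implemented in Hudi.

    import org.apache.avro.Schema;
    import org.apache.hudi.common.config.TypedProperties;
    import org.apache.hudi.common.engine.HoodieEngineContext;

    // Hypothetical engine-agnostic schema provider; name and package are illustrative.
    public abstract class CommonSchemaProvider implements java.io.Serializable {

      protected final TypedProperties config;

      // Engine-agnostic context: Spark, Flink, and Kafka Connect callers each
      // supply their own HoodieEngineContext implementation.
      protected transient HoodieEngineContext context;

      protected CommonSchemaProvider(TypedProperties config, HoodieEngineContext context) {
        this.config = config;
        this.context = context;
      }

      /** Schema of the incoming source records. */
      public abstract Schema getSourceSchema();

      /** Schema of the records written to the Hudi table; defaults to the source schema. */
      public Schema getTargetSchema() {
        return getSourceSchema();
      }
    }

Under this sketch, a Spark deployment would pass a HoodieSparkEngineContext, a Flink deployment a HoodieFlinkEngineContext, and Kafka Connect its own lightweight HoodieEngineContext implementation, while concrete providers (for example, a file- or registry-backed one) only need to implement getSourceSchema().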


