Posted to issues@iceberg.apache.org by GitBox <gi...@apache.org> on 2020/07/21 17:20:59 UTC

[GitHub] [iceberg] rdblue commented on issue #1215: FlinkParquetWriter should build writer with schema visitor

rdblue commented on issue #1215:
URL: https://github.com/apache/iceberg/issues/1215#issuecomment-661994888


   Thanks for the clarification on the types that Flink uses.
   
   I agree that we should create readers and writers that work directly with these types, instead of trying to make the Iceberg generics work.
   
   > I wonder if we can only implement Flink internal data structure conversions
   
This is what we do for Spark: we try to work only with `InternalRow`, and add conversions only where needed, for example to test against the public `Row` we get back from some interfaces.
   
   I would prefer to have just one data model implementation for Flink's internal representation (`RowData`). If we need to, we can have one for the external `Row` as well, but if Flink can already convert from `RowData` to `Row`, I would like to try to use those conversions instead.
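The single-data-model approach described above can be sketched as follows. All type names here (`SimpleRowData`, `SimpleRow`, `RowDataWriter`, `RowToRowDataConverter`) are hypothetical stand-ins, not real Flink or Iceberg classes; in an actual implementation the writer would target Flink's `org.apache.flink.table.data.RowData`, and the conversion step would reuse Flink's own `RowData`-to-`Row` utilities rather than a hand-written converter.

```java
import java.util.List;

// Hypothetical stand-in for Flink's internal RowData representation.
interface SimpleRowData {
    Object getField(int pos);
}

// Hypothetical stand-in for Flink's external Row type.
class SimpleRow {
    final List<Object> fields;

    SimpleRow(List<Object> fields) {
        this.fields = fields;
    }
}

// The one data model implementation: a writer that only understands
// the internal representation.
class RowDataWriter {
    private final StringBuilder out = new StringBuilder();

    void write(SimpleRowData row, int arity) {
        for (int i = 0; i < arity; i++) {
            out.append(row.getField(i)).append(i < arity - 1 ? "," : "\n");
        }
    }

    String contents() {
        return out.toString();
    }
}

// External rows are handled by converting at the boundary, instead of
// maintaining a second writer implementation for the external type.
class RowToRowDataConverter {
    static SimpleRowData convert(SimpleRow row) {
        return pos -> row.fields.get(pos);
    }
}

public class OneDataModelExample {
    public static void main(String[] args) {
        RowDataWriter writer = new RowDataWriter();

        // Internal rows are written directly...
        SimpleRowData internal = pos -> new Object[] {1, "a"}[pos];
        writer.write(internal, 2);

        // ...and external rows go through the conversion first.
        SimpleRow external = new SimpleRow(List.of(2, "b"));
        writer.write(RowToRowDataConverter.convert(external), 2);

        System.out.println(writer.contents());
    }
}
```

The point of the sketch is that only `RowDataWriter` knows how to serialize anything; the external type never gets its own writer, which keeps the two paths from drifting apart.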


----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
users@infra.apache.org



---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@iceberg.apache.org
For additional commands, e-mail: issues-help@iceberg.apache.org