Posted to issues@flink.apache.org by "buptljy (JIRA)" <ji...@apache.org> on 2018/07/27 09:51:00 UTC
[jira] [Comment Edited] (FLINK-9964) Add a CSV table format factory
[ https://issues.apache.org/jira/browse/FLINK-9964?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16559502#comment-16559502 ]
buptljy edited comment on FLINK-9964 at 7/27/18 9:50 AM:
---------------------------------------------------------
[~twalthr] I've tried some tests with the Jackson library, and it looks good except that it cannot support nested array values such as *String[][].class*. Still, I think we should do what you just suggested as a first step.
was (Author: wind_ljy):
[~twalthr] I've tried some tests on the Jackson library, and it looks good except that it cannot support nested Array values like *String[][].class*, but I think we should do the same as you said at the first step.
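Since Jackson's CSV dataformat cannot bind a nested array type such as *String[][].class* directly, one possible workaround is to flatten such a field into indexed columns before handing the row to the CSV writer. The sketch below is only an illustration of that idea using plain JDK collections; the class name, column-naming scheme, and helper are hypothetical, not Flink or Jackson API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class NestedArrayFlattener {

    // Flatten a two-level array into CSV-friendly columns keyed by
    // "<prefix>.<outerIndex>.<innerIndex>" (the key format is an assumption).
    public static Map<String, String> flatten(String prefix, String[][] values) {
        Map<String, String> columns = new LinkedHashMap<>();
        for (int i = 0; i < values.length; i++) {
            for (int j = 0; j < values[i].length; j++) {
                columns.put(prefix + "." + i + "." + j, values[i][j]);
            }
        }
        return columns;
    }

    public static void main(String[] args) {
        String[][] nested = {{"a", "b"}, {"c"}};
        // prints {tags.0.0=a, tags.0.1=b, tags.1.0=c}
        System.out.println(flatten("tags", nested));
    }
}
```

The resulting flat map could then be serialized with any CSV writer that only understands scalar columns.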
> Add a CSV table format factory
> ------------------------------
>
> Key: FLINK-9964
> URL: https://issues.apache.org/jira/browse/FLINK-9964
> Project: Flink
> Issue Type: Sub-task
> Components: Table API & SQL
> Reporter: Timo Walther
> Assignee: buptljy
> Priority: Major
>
> We should add an RFC 4180 compliant CSV table format factory for reading data from and writing data to Kafka and other connectors. This requires a {{SerializationSchemaFactory}} and a {{DeserializationSchemaFactory}}. How we want to represent all data types and nested types is still up for discussion. For example, we could flatten and deflatten nested types as is done [here|http://support.gnip.com/articles/json2csv.html]. We could also look at how tools such as the Avro to CSV tool perform the conversion.
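The flatten/deflatten approach mentioned in the description can be sketched with plain maps: nested fields become dot-separated column names, and the inverse rebuilds the nesting. This is a minimal illustration under assumed conventions (dot-separated keys, no escaping of dots in field names); it is not part of any proposed Flink API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class CsvNesting {

    // Flatten nested maps into one level with dot-separated keys,
    // e.g. {user={name=bob}} -> {user.name=bob}.
    public static Map<String, Object> flatten(Map<String, Object> nested) {
        Map<String, Object> flat = new LinkedHashMap<>();
        flattenInto("", nested, flat);
        return flat;
    }

    private static void flattenInto(String prefix, Map<String, Object> nested,
                                    Map<String, Object> flat) {
        for (Map.Entry<String, Object> e : nested.entrySet()) {
            String key = prefix.isEmpty() ? e.getKey() : prefix + "." + e.getKey();
            if (e.getValue() instanceof Map) {
                @SuppressWarnings("unchecked")
                Map<String, Object> child = (Map<String, Object>) e.getValue();
                flattenInto(key, child, flat);
            } else {
                flat.put(key, e.getValue());
            }
        }
    }

    // Inverse: rebuild nested maps from dot-separated keys.
    @SuppressWarnings("unchecked")
    public static Map<String, Object> deflatten(Map<String, Object> flat) {
        Map<String, Object> nested = new LinkedHashMap<>();
        for (Map.Entry<String, Object> e : flat.entrySet()) {
            String[] parts = e.getKey().split("\\.");
            Map<String, Object> cur = nested;
            for (int i = 0; i < parts.length - 1; i++) {
                cur = (Map<String, Object>) cur.computeIfAbsent(
                        parts[i], k -> new LinkedHashMap<String, Object>());
            }
            cur.put(parts[parts.length - 1], e.getValue());
        }
        return nested;
    }

    public static void main(String[] args) {
        Map<String, Object> user = new LinkedHashMap<>();
        user.put("name", "bob");
        Map<String, Object> row = new LinkedHashMap<>();
        row.put("id", 1);
        row.put("user", user);
        System.out.println(flatten(row));            // {id=1, user.name=bob}
        System.out.println(deflatten(flatten(row))); // {id=1, user={name=bob}}
    }
}
```

A real implementation would additionally need to handle dots inside field names and decide how arrays map onto columns, which is exactly the open question in the issue.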
--
This message was sent by Atlassian JIRA
(v7.6.3#76005)