Posted to issues@flink.apache.org by "Chesnay Schepler (JIRA)" <ji...@apache.org> on 2016/03/16 17:02:33 UTC
[jira] [Updated] (FLINK-3524) Provide a JSONSerialisationSchema in the kafka connector package
[ https://issues.apache.org/jira/browse/FLINK-3524?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Chesnay Schepler updated FLINK-3524:
------------------------------------
Description:
(I don't want to include this into 1.0.0)
Currently, there is no standardized way of parsing JSON data from a Kafka stream. I see a lot of users using JSON in their topics, so providing a schema for them would make things easier for our users.
I suggest using the jackson library because we already have it as a dependency in Flink and it can parse from a byte[].
I suggest providing the following classes:
- JSONDeserializationSchema()
- JSONKeyValueDeserializationSchema(boolean includeMetadata)
The second variant should produce a record like this:
{code}
{"key": "keydata",
 "value": "valuedata",
 "metadata": {"offset": 123, "topic": "<topic>", "partition": 2 }
}
{code}
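To make the proposed envelope concrete, here is a minimal, language-agnostic sketch (in Python, since the Flink API does not exist yet) of what the key/value variant with metadata could produce per Kafka record. The function name and parameters are assumptions for illustration only, not the final Flink API.

```python
import json

def deserialize_key_value(key_bytes, value_bytes, offset, topic, partition,
                          include_metadata=True):
    # Hypothetical sketch of the proposed JSONKeyValueDeserializationSchema:
    # parse the raw Kafka key and value bytes as JSON and wrap them in one
    # envelope record; names and layout follow the issue description.
    record = {
        "key": json.loads(key_bytes) if key_bytes is not None else None,
        "value": json.loads(value_bytes),
    }
    if include_metadata:
        # Kafka-level metadata travels alongside the payload.
        record["metadata"] = {"offset": offset, "topic": topic,
                              "partition": partition}
    return record

rec = deserialize_key_value(b'"keydata"', b'"valuedata"', 123, "<topic>", 2)
print(json.dumps(rec))  # prints the envelope record shown above
```

With includeMetadata=false, the same sketch would emit only the "key" and "value" fields, which keeps the plain-payload case lean.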
was:
(I don't want to include this into 1.0.0)
Currently, there is no standardized way of parsing JSON data from a Kafka stream. I see a lot of users using JSON in their topics, so providing a schema for them would make things easier for our users.
I suggest using the jackson library because we already have it as a dependency in Flink and it can parse from a byte[].
I suggest providing the following classes:
- JSONSerializationSchema()
- JSONKeyValueSerializationSchema(bool includeMetadata)
The second variant should produce a record like this:
{code}
{"key": "keydata",
 "value": "valuedata",
 "metadata": {"offset": 123, "topic": "<topic>", "partition": 2 }
}
{code}
> Provide a JSONSerialisationSchema in the kafka connector package
> ----------------------------------------------------------------
>
> Key: FLINK-3524
> URL: https://issues.apache.org/jira/browse/FLINK-3524
> Project: Flink
> Issue Type: Improvement
> Components: Kafka Connector
> Reporter: Robert Metzger
> Labels: starter
>
> (I don't want to include this into 1.0.0)
> Currently, there is no standardized way of parsing JSON data from a Kafka stream. I see a lot of users using JSON in their topics, so providing a schema for them would make things easier for our users.
> I suggest using the jackson library because we already have it as a dependency in Flink and it can parse from a byte[].
> I suggest providing the following classes:
> - JSONDeserializationSchema()
> - JSONKeyValueDeserializationSchema(boolean includeMetadata)
> The second variant should produce a record like this:
> {code}
> {"key": "keydata",
>  "value": "valuedata",
>  "metadata": {"offset": 123, "topic": "<topic>", "partition": 2 }
> }
> {code}
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)