Posted to issues@flink.apache.org by "Robert Metzger (JIRA)" <ji...@apache.org> on 2016/04/04 15:51:25 UTC
[jira] [Resolved] (FLINK-3524) Provide a JSONDeserialisationSchema in the kafka connector package
[ https://issues.apache.org/jira/browse/FLINK-3524?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Robert Metzger resolved FLINK-3524.
-----------------------------------
Resolution: Fixed
Fix Version/s: 1.1.0
Resolved in http://git-wip-us.apache.org/repos/asf/flink/commit/c7595840
> Provide a JSONDeserialisationSchema in the kafka connector package
> ------------------------------------------------------------------
>
> Key: FLINK-3524
> URL: https://issues.apache.org/jira/browse/FLINK-3524
> Project: Flink
> Issue Type: Improvement
> Components: Kafka Connector
> Reporter: Robert Metzger
> Assignee: Chesnay Schepler
> Labels: starter
> Fix For: 1.1.0
>
>
> (I don't want to include this into 1.0.0)
> Currently, there is no standardized way of parsing JSON data from a Kafka stream. I see a lot of users using JSON in their topics, so providing a deserialization schema for them would make things easier.
> I suggest using the Jackson library because we already have it as a dependency in Flink and it can parse from a byte[].
> I would suggest providing the following classes:
> - JSONDeserializationSchema()
> - JSONKeyValueDeserializationSchema(boolean includeMetadata)
> The second variant should produce a record like this:
> {code}
> {"key": "keydata",
> "value": "valuedata",
> "metadata": {"offset": 123, "topic": "<topic>", "partition": 2 }
> }
> {code}
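As a rough illustration of the key/value variant described above, the sketch below shows how the includeMetadata flag could shape the produced record. This is a hypothetical, dependency-free sketch: the class name JsonKeyValueSketch and its deserialize signature are illustrative only; the real connector class would implement Flink's deserialization schema interface and use Jackson to parse the raw byte[] payloads into JSON nodes.

{code}
import java.nio.charset.StandardCharsets;
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of the proposed key/value schema (not the actual
// Flink API). The real implementation would parse the byte[] with
// Jackson; here we just decode as UTF-8 strings to keep it self-contained.
class JsonKeyValueSketch {
    private final boolean includeMetadata;

    JsonKeyValueSketch(boolean includeMetadata) {
        this.includeMetadata = includeMetadata;
    }

    Map<String, Object> deserialize(byte[] key, byte[] value,
                                    String topic, int partition, long offset) {
        Map<String, Object> record = new HashMap<>();
        record.put("key", new String(key, StandardCharsets.UTF_8));
        record.put("value", new String(value, StandardCharsets.UTF_8));
        if (includeMetadata) {
            // Kafka metadata is attached only when requested, matching
            // the includeMetadata constructor flag proposed in the issue.
            Map<String, Object> meta = new HashMap<>();
            meta.put("offset", offset);
            meta.put("topic", topic);
            meta.put("partition", partition);
            record.put("metadata", meta);
        }
        return record;
    }
}
{code}

With includeMetadata=true, deserializing key "keydata" and value "valuedata" from partition 2 at offset 123 yields a record matching the shape shown in the example above.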
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)