Posted to user@flink.apache.org by Pavel Ciorba <pa...@gmail.com> on 2018/03/26 19:56:54 UTC

Table/SQL Kafka Sink Question

Hi everyone!

Can I specify a *message key* using the Kafka sink in the Table/SQL API?
The goal is to sink each row as JSON alongside a message key into Kafka.

I was achieving this with the Stream API by specifying a
*KeyedSerializationSchema* and implementing its *serializeKey()* method.
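
For reference, such a schema could look roughly like the standalone sketch below. Flink's KeyedSerializationSchema interface is re-declared locally with the same three methods so the example compiles without Flink on the classpath; in a real job you would implement org.apache.flink.streaming.util.serialization.KeyedSerializationSchema and hand the instance to a FlinkKafkaProducer. The JsonRow type and the field choice are assumptions for illustration.

```java
import java.nio.charset.StandardCharsets;

// Local stand-in with the same shape as Flink's KeyedSerializationSchema,
// so this sketch is self-contained (assumption: real code implements
// Flink's own interface instead).
interface KeyedSerializationSchema<T> {
    byte[] serializeKey(T element);
    byte[] serializeValue(T element);
    String getTargetTopic(T element);
}

public class KeyedJsonSchemaSketch
        implements KeyedSerializationSchema<KeyedJsonSchemaSketch.JsonRow> {

    // Minimal stand-in for a row that is already rendered to JSON.
    static class JsonRow {
        final String key;
        final String json;
        JsonRow(String key, String json) { this.key = key; this.json = json; }
    }

    @Override
    public byte[] serializeKey(JsonRow row) {
        // The bytes returned here become the Kafka message key.
        return row.key.getBytes(StandardCharsets.UTF_8);
    }

    @Override
    public byte[] serializeValue(JsonRow row) {
        // The JSON payload becomes the Kafka message value.
        return row.json.getBytes(StandardCharsets.UTF_8);
    }

    @Override
    public String getTargetTopic(JsonRow row) {
        return null; // null = use the producer's default topic
    }

    public static void main(String[] args) {
        KeyedJsonSchemaSketch schema = new KeyedJsonSchemaSketch();
        JsonRow row = new JsonRow("user-42", "{\"user\":\"user-42\",\"clicks\":3}");
        System.out.println(new String(schema.serializeKey(row), StandardCharsets.UTF_8));
        System.out.println(new String(schema.serializeValue(row), StandardCharsets.UTF_8));
    }
}
```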

Thanks in advance!

Re: Table/SQL Kafka Sink Question

Posted by Timo Walther <tw...@apache.org>.
Hi Alexandru,

the KafkaTableSink does not expose all features of the underlying 
DataStream API. Either convert your table program to the DataStream 
API for the sink operation, or extend a class like 
Kafka010JsonTableSink and customize it.

Regards,
Timo


On 27.03.18 at 11:59, Alexandru Gutan wrote:
> That's what I concluded as well after checking the docs and source code.
>
> I'm thinking of adding another job using the Stream API (where it is 
> possible) that will ingest the data produced by the Table/SQL API 
> job and add the message key in Kafka.
>
> On 27 March 2018 at 12:55, Chesnay Schepler <chesnay@apache.org 
> <ma...@apache.org>> wrote:
>
>     Hello,
>
>     as far as I can tell this is not possible. I'm including Timo;
>     maybe he can explain why this isn't supported.
>
>
>     On 26.03.2018 21:56, Pavel Ciorba wrote:
>>     Hi everyone!
>>
>>     Can I specify a *message key* using the Kafka sink in the
>>     Table/SQL API?
>>     The goal is to sink each row as JSON alongside a message
>>     key into Kafka.
>>
>>     I was achieving this with the Stream API by specifying a
>>     *KeyedSerializationSchema* and implementing its *serializeKey()* method.
>>
>>     Thanks in advance!
>
>
>


Re: Table/SQL Kafka Sink Question

Posted by Alexandru Gutan <al...@gmail.com>.
That's what I concluded as well after checking the docs and source code.

I'm thinking of adding another job using the Stream API (where it is
possible) that will ingest the data produced by the Table/SQL API job
and add the message key in Kafka.
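
A bridge job like that would need a key-extraction step: read the JSON the Table/SQL job wrote and pull out the field that should become the Kafka message key. A hedged, standalone sketch of just that step is below; the field name "id" and the naive string scan are assumptions for illustration, and a real job would use a proper JSON parser (e.g. Jackson) inside its KeyedSerializationSchema.

```java
import java.nio.charset.StandardCharsets;

public class KeyFromJsonSketch {

    // Returns the string value of `field` in a flat JSON object as the
    // message-key bytes, or null if the field is absent. Naive scan for
    // illustration only; use a real JSON parser in production code.
    static byte[] extractKey(String json, String field) {
        String needle = "\"" + field + "\":\"";
        int start = json.indexOf(needle);
        if (start < 0) {
            return null;
        }
        start += needle.length();
        int end = json.indexOf('"', start);
        return json.substring(start, end).getBytes(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Example record as the Table/SQL job might have written it.
        String json = "{\"id\":\"user-42\",\"clicks\":3}";
        System.out.println(new String(extractKey(json, "id"), StandardCharsets.UTF_8));
    }
}
```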

On 27 March 2018 at 12:55, Chesnay Schepler <ch...@apache.org> wrote:

> Hello,
>
> as far as I can tell this is not possible. I'm including Timo; maybe he
> can explain why this isn't supported.
>
>
> On 26.03.2018 21:56, Pavel Ciorba wrote:
>
> Hi everyone!
>
> Can I specify a *message key* using the Kafka sink in the Table/SQL API?
> The goal is to sink each row as JSON alongside a message key into
> Kafka.
>
> I was achieving this with the Stream API by specifying a
> *KeyedSerializationSchema* and implementing its *serializeKey()* method.
>
> Thanks in advance!
>
>
>

Re: Table/SQL Kafka Sink Question

Posted by Chesnay Schepler <ch...@apache.org>.
Hello,

as far as I can tell this is not possible. I'm including Timo; maybe he 
can explain why this isn't supported.

On 26.03.2018 21:56, Pavel Ciorba wrote:
> Hi everyone!
>
> Can I specify a *message key* using the Kafka sink in the Table/SQL API?
> The goal is to sink each row as JSON alongside a message key 
> into Kafka.
>
> I was achieving this with the Stream API by specifying a 
> *KeyedSerializationSchema* and implementing its *serializeKey()* method.
>
> Thanks in advance!