Posted to dev@flink.apache.org by Fabian Hueske <fh...@gmail.com> on 2018/07/03 08:15:53 UTC

Re: Kafka Avro Table Source

Hi Will,

The community is currently working on improving the Kafka Avro integration
for Flink SQL.
There's a PR [1]. If you like, you could try it out and give some feedback.

Timo (in CC) has been working on the Kafka Avro integration and should be
able to help with any specific questions.

Best, Fabian

[1] https://github.com/apache/flink/pull/6218

2018-07-03 3:02 GMT+02:00 Will Du <wi...@gmail.com>:

> Hi folks,
> I am working on mapping an Avro table source to a Kafka source. Looking
> at the example, I think the current Flink v1.5.0 connector is not
> flexible enough. I wonder whether I have to specify the Avro record
> class in order to read from Kafka.
>
> For withSchema, the schema can be obtained from the schema registry.
> However, the Avro record class seems to be hard-coded.
>
> thanks,
> Will
>
> KafkaTableSource source = Kafka010AvroTableSource.builder()
>   // set Kafka topic
>   .forTopic("sensors")
>   // set Kafka consumer properties
>   .withKafkaProperties(kafkaProps)
>   // set Table schema
>   .withSchema(TableSchema.builder()
>     .field("sensorId", Types.LONG())
>     .field("temp", Types.DOUBLE())
>     .field("time", Types.SQL_TIMESTAMP()).build())
>   // set class of Avro record
>   .forAvroRecordClass(SensorReading.class)  // ? Any way to get this without hard-coding the class
>   .build();
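For readers with the same question: the usual way to avoid compiling a
specific record class into a job is Avro's GenericRecord API, which resolves
the schema at runtime. Below is a minimal, self-contained sketch in plain
Avro (no Flink or Kafka involved); the inlined SensorReading schema mirrors
the table schema above and is an illustrative stand-in for a JSON schema
string that would, in practice, be fetched from a schema registry.

import java.io.ByteArrayOutputStream;

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericDatumWriter;
import org.apache.avro.generic.GenericRecord;
import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.BinaryEncoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.io.EncoderFactory;

public class GenericSensorReading {

    // Illustrative schema matching the table schema above. In practice this
    // JSON string could be fetched at runtime from a Confluent schema
    // registry, e.g. (hypothetical subject name, registry at localhost:8081):
    //   CachedSchemaRegistryClient client =
    //       new CachedSchemaRegistryClient("http://localhost:8081", 100);
    //   String json = client.getLatestSchemaMetadata("sensors-value").getSchema();
    private static final String SENSOR_SCHEMA =
        "{\"type\":\"record\",\"name\":\"SensorReading\",\"fields\":["
            + "{\"name\":\"sensorId\",\"type\":\"long\"},"
            + "{\"name\":\"temp\",\"type\":\"double\"},"
            + "{\"name\":\"time\",\"type\":\"long\"}]}";

    public static void main(String[] args) throws Exception {
        Schema schema = new Schema.Parser().parse(SENSOR_SCHEMA);

        // Build a record generically (stands in for a Kafka message payload).
        GenericRecord reading = new GenericData.Record(schema);
        reading.put("sensorId", 42L);
        reading.put("temp", 21.5);
        reading.put("time", System.currentTimeMillis());

        // Serialize it to Avro binary.
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        BinaryEncoder encoder = EncoderFactory.get().binaryEncoder(out, null);
        new GenericDatumWriter<GenericRecord>(schema).write(reading, encoder);
        encoder.flush();

        // Read it back as a GenericRecord -- no generated SensorReading class needed.
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(out.toByteArray(), null);
        GenericRecord decoded = new GenericDatumReader<GenericRecord>(schema).read(null, decoder);
        System.out.println(decoded.get("sensorId") + " @ " + decoded.get("temp"));
    }
}

This only demonstrates the generic decoding technique itself; whether a given
Flink table source accepts a runtime schema instead of a class depends on the
connector, and the Kafka010AvroTableSource in this thread appears to be
exactly where that limitation sat, which the PR above [1] set out to address.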