Posted to user@flink.apache.org by Joseph Lorenzini <JL...@gohealth.com> on 2021/08/06 19:16:30 UTC

Support for authenticated schema registry in debezium registry

Hi all,



I am on Flink 1.13.2. I created a table like so:



CREATE TABLE lead_buffer (
  `id` INT NOT NULL,
  `state` STRING NOT NULL,
  PRIMARY KEY (`id`) NOT ENFORCED
) WITH (
  'connector' = 'kafka',
  'topic' = 'buffer',
  'format' = 'debezium-avro-confluent',
  'debezium-avro-confluent.schema-registry.url' = 'http://localhost:8081',
  'scan.startup.mode' = 'earliest-offset',
  'properties.group.id' = 'customers',
  'debezium-avro-confluent.basic-auth.user-info' = 'sr-user:sr-user-password',
  'properties.bootstrap.servers' = 'localhost:9092',
  'value.fields-include' = 'EXCEPT_KEY',
  'properties.ssl.endpoint.identification.algorithm' = 'http',
  'properties.security.protocol' = 'SASL_PLAINTEXT',
  'properties.sasl.mechanism' = 'PLAIN',
  'properties.sasl.jaas.config' = 'org.apache.kafka.common.security.plain.PlainLoginModule required username="kafka" password="password";'
)





I am looking at the docs here:



https://ci.apache.org/projects/flink/flink-docs-release-1.13/docs/connectors/table/formats/debezium/#debezium-avro-confluent-basic-auth-user-info



According to the properties table, there is a property for authenticating against a
schema registry: debezium-avro-confluent.basic-auth.user-info. However, when I
set this option in the CREATE TABLE DDL, I get this error:





Caused by: org.apache.flink.table.api.ValidationException: Unsupported options found for 'kafka'.

Unsupported options:

debezium-avro-confluent.basic-auth.user-info
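
For what it's worth, here is a stripped-down sketch of the same table definition that
should reproduce the error with only the Kafka connection and format options in play
(the broker, registry URL, and credentials are placeholders, and the table name
lead_buffer_min is just for illustration):

CREATE TABLE lead_buffer_min (
  `id` INT NOT NULL,
  `state` STRING NOT NULL,
  PRIMARY KEY (`id`) NOT ENFORCED
) WITH (
  'connector' = 'kafka',
  'topic' = 'buffer',
  'properties.bootstrap.servers' = 'localhost:9092',
  'scan.startup.mode' = 'earliest-offset',
  'format' = 'debezium-avro-confluent',
  'debezium-avro-confluent.schema-registry.url' = 'http://localhost:8081',
  -- the option below is the one the ValidationException above reports as unsupported
  'debezium-avro-confluent.basic-auth.user-info' = 'sr-user:sr-user-password'
)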





I found “FLINK-22858: avro-confluent doesn't support confluent schema registry
that has security enabled”. However, that ticket was closed as a duplicate of
FLINK-21229, which has been resolved and marked as fixed in 1.13.2.



Does anyone know whether this has in fact been fixed, or whether this is user error
on my part?



Thanks,

Joe



Re: Support for authenticated schema registry in debezium registry

Posted by Ingo Bürk <in...@ververica.com>.
Hi Joe,

there was a follow-up issue, FLINK-23450, which is only fixed in 1.13.3
(not yet released); I think that is what you're seeing.


Best
Ingo
