Posted to user@flink.apache.org by Dawid Wysakowicz <dw...@apache.org> on 2021/02/01 14:47:49 UTC

Re: Connect to schema registry via SSL

Hi,

I am afraid passing these options is not supported in SQL yet. I
created FLINK-21229 [1] to add support for it.

In a regular job you can construct a schema registry client manually:

        RegistryAvroDeserializationSchema<GenericRecord> deserializationSchema =
                new RegistryAvroDeserializationSchema<>(
                        GenericRecord.class, // or a SpecificRecord class and a null schema
                        schema,
                        () -> new ConfluentSchemaRegistryCoder(
                                new CachedSchemaRegistryClient(/* configure with ssl */)
                        )
                );
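As a minimal sketch of what "configure with ssl" could look like: recent versions of the Confluent client offer a constructor that accepts a config map, CachedSchemaRegistryClient(String baseUrl, int identityMapCapacity, Map<String, ?> configs). Whether the schema.registry.ssl.* keys below are honored depends on the schema-registry-client version on your classpath, and all paths, passwords, and the URL are placeholders:

```java
import java.util.HashMap;
import java.util.Map;

public class SchemaRegistrySslConfigExample {

    // Builds the SSL config map that would be passed as the third argument of
    // CachedSchemaRegistryClient(baseUrl, identityMapCapacity, configs).
    // The keys mirror the ones that work for a plain Kafka consumer/producer.
    public static Map<String, Object> sslConfigs(
            String truststoreLocation, String truststorePassword,
            String keystoreLocation, String keystorePassword) {
        Map<String, Object> configs = new HashMap<>();
        configs.put("schema.registry.ssl.truststore.location", truststoreLocation);
        configs.put("schema.registry.ssl.truststore.password", truststorePassword);
        configs.put("schema.registry.ssl.keystore.location", keystoreLocation);
        configs.put("schema.registry.ssl.keystore.password", keystorePassword);
        return configs;
    }

    public static void main(String[] args) {
        // Placeholder paths and passwords, for illustration only.
        Map<String, Object> configs = sslConfigs(
                "/etc/ssl/sr-truststore.jks", "changeit",
                "/etc/ssl/sr-keystore.jks", "changeit");
        // The map would then be handed to the client, e.g.:
        // new CachedSchemaRegistryClient("https://registry:8081", 1000, configs)
        System.out.println(configs.size() + " SSL properties configured");
    }
}
```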

Best,

Dawid

[1] https://issues.apache.org/jira/browse/FLINK-21229

On 28/01/2021 17:39, Laurent Exsteens wrote:
> Hello,
>
> I'm trying to use Flink SQL (on Ververica Platform, so no other options
> than pure Flink SQL) to read Confluent Avro messages from Kafka, when
> the schema registry is secured via SSL.
>
> Would you know what the correct properties are to set up in the Kafka
> consumer config?
>
> The following options work for a simple Java Kafka producer/consumer
> (not a Flink job):
> - schema.registry.ssl.truststore.location
> - schema.registry.ssl.truststore.password
> - schema.registry.ssl.keystore.location
> - schema.registry.ssl.keystore.password
>
> However, they don't seem to be taken into account in my query (nor in
> the Flink job I tried), even when prefixed with 'properties.'.
>
> I'm using Flink 1.11 for the SQL query (Ververica Platform 2.3), and
> Flink 1.10 in my job.
>
> Would you have an idea how I can tell my Flink SQL Kafka connector to
> connect to that SR via SSL? Or a regular Flink job?
>
> Thanks in advance for your help.
>
> Best Regards,
>
> Laurent.
>
>
> -- 
> *Laurent Exsteens*
> Data Engineer
> (M) +32 (0) 486 20 48 36
>
> *EURA NOVA*
>
> Rue Emile Francqui, 4
>
> 1435 Mont-Saint-Guibert
>
> (T) +32 10 75 02 00
>
> *euranova.eu <http://euranova.eu/>*
>
> *research.euranova.eu* <http://research.euranova.eu/>
>
>
> ♻ Be green, keep it on the screen 

Re: Connect to schema registry via SSL

Posted by Laurent Exsteens <la...@euranova.eu>.
Hi Dawid,

Thank you for your answer. I suspected the same and was about to trace
the calls to verify it. You saved me some time.

I guess it might be too early to tell, but if you know in which Flink
and Ververica Platform versions this will become available, and when,
I'm of course interested.
For now I'll see with my client whether there's any possibility to get
at least read access to the schema registry without SSL. At least
temporarily.

Best regards,

Laurent.




-- 
♻ Be green, keep it on the screen