Posted to jira@kafka.apache.org by "Guy Pascarella (Jira)" <ji...@apache.org> on 2022/01/27 12:08:00 UTC

[jira] [Updated] (KAFKA-13613) Kafka Connect has a hard dependency on KeyGenerator.HmacSHA256

     [ https://issues.apache.org/jira/browse/KAFKA-13613?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Guy Pascarella updated KAFKA-13613:
-----------------------------------
    Description: 
If a server is running Java 8 that has been configured for FIPS mode according to [openjdk-8-configuring_openjdk_8_on_rhel_with_fips-en-us.pdf|https://access.redhat.com/documentation/en-us/openjdk/8/pdf/configuring_openjdk_8_on_rhel_with_fips/openjdk-8-configuring_openjdk_8_on_rhel_with_fips-en-us.pdf], then the SunJCE provider is not available, and as a result the HmacSHA256 KeyGenerator is not available either. The KeyGenerators I see available are:

 * DES
 * ARCFOUR
 * AES
 * DESede

Of these, I think AES would be the most appropriate, but that is not the point of this issue; I am including the list only for completeness. (A quick way to enumerate what is actually available is sketched below.)
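
For reference, the list above can be reproduced by walking the installed security providers and printing their KeyGenerator services. This is a minimal, standalone sketch (not part of Kafka); on a FIPS-restricted JDK without SunJCE it should print roughly the short list above.

{noformat}
import java.security.Provider;
import java.security.Security;

public class ListKeyGenerators {
    public static void main(String[] args) {
        // Walk every installed provider and print the KeyGenerator algorithms it offers.
        for (Provider provider : Security.getProviders()) {
            for (Provider.Service service : provider.getServices()) {
                if ("KeyGenerator".equals(service.getType())) {
                    System.out.println(provider.getName() + ": " + service.getAlgorithm());
                }
            }
        }
    }
}
{noformat}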

When Kafka Connect is started in distributed mode on one of these servers, I see the following stack trace:

{noformat}
[2022-01-20 20:36:30,027] ERROR Stopping due to error (org.apache.kafka.connect.cli.ConnectDistributed)
java.lang.ExceptionInInitializerError
        at org.apache.kafka.connect.cli.ConnectDistributed.startConnect(ConnectDistributed.java:94)
        at org.apache.kafka.connect.cli.ConnectDistributed.main(ConnectDistributed.java:79)
Caused by: org.apache.kafka.common.config.ConfigException: Invalid value HmacSHA256 for configuration inter.worker.key.generation.algorithm: HmacSHA256 KeyGenerator not available
        at org.apache.kafka.connect.runtime.distributed.DistributedConfig.validateKeyAlgorithm(DistributedConfig.java:504)
        at org.apache.kafka.connect.runtime.distributed.DistributedConfig.lambda$configDef$2(DistributedConfig.java:375)
        at org.apache.kafka.common.config.ConfigDef$LambdaValidator.ensureValid(ConfigDef.java:1043)
        at org.apache.kafka.common.config.ConfigDef$ConfigKey.<init>(ConfigDef.java:1164)
        at org.apache.kafka.common.config.ConfigDef.define(ConfigDef.java:152)
        at org.apache.kafka.common.config.ConfigDef.define(ConfigDef.java:172)
        at org.apache.kafka.common.config.ConfigDef.define(ConfigDef.java:211)
        at org.apache.kafka.common.config.ConfigDef.define(ConfigDef.java:373)
        at org.apache.kafka.connect.runtime.distributed.DistributedConfig.configDef(DistributedConfig.java:371)
        at org.apache.kafka.connect.runtime.distributed.DistributedConfig.<clinit>(DistributedConfig.java:196)
        ... 2 more
{noformat}

It appears that {{org.apache.kafka.connect.runtime.distributed.DistributedConfig}} triggers validation of the hard-coded default for the {{inter.worker.key.generation.algorithm}} property, which is {{HmacSHA256}}, while the class is being initialized.

Ideally, a fix would consult the value supplied in the configuration file before attempting to validate the hard-coded default.
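
To make the failure mode concrete, here is a minimal, hypothetical sketch (plain Java, not the actual {{DistributedConfig}} code; the method names are illustrative). Eagerly validating the hard-coded default throws on a FIPS-restricted JDK, whereas checking only the effective value, or falling back to an algorithm the JVM does provide, would let such a worker start.

{noformat}
import java.security.NoSuchAlgorithmException;
import javax.crypto.KeyGenerator;

public class KeyAlgorithmCheck {

    // Mirrors the eager check that fails today: HmacSHA256 has no registered
    // provider on a FIPS-restricted JDK, so getInstance throws.
    static void requireKeyAlgorithm(String algorithm) {
        try {
            KeyGenerator.getInstance(algorithm);
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalArgumentException(algorithm + " KeyGenerator not available", e);
        }
    }

    // One possible direction for a fix (hypothetical): validate only the value
    // that was actually configured, and fall back from an unavailable default.
    static String resolveKeyAlgorithm(String configuredValue, String defaultValue) {
        String candidate = (configuredValue != null) ? configuredValue : defaultValue;
        try {
            KeyGenerator.getInstance(candidate);
            return candidate;
        } catch (NoSuchAlgorithmException e) {
            if (configuredValue != null) {
                // An explicitly configured but missing algorithm is a real error.
                throw new IllegalArgumentException(candidate + " KeyGenerator not available", e);
            }
            // The default (HmacSHA256) is unavailable under FIPS; fall back to
            // something the JVM does expose, e.g. AES from the list above.
            return "AES";
        }
    }

    public static void main(String[] args) {
        // On a FIPS-restricted JDK this prints "AES"; elsewhere it prints "HmacSHA256".
        System.out.println(resolveKeyAlgorithm(null, "HmacSHA256"));
    }
}
{noformat}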

Update [2022/01/27]: I just tested on a FIPS-enabled version of OpenJDK 11 using the instructions at [configuring_openjdk_11_on_rhel_with_fips|https://access.redhat.com/documentation/en-us/openjdk/11/html-single/configuring_openjdk_11_on_rhel_with_fips/index], and ran into the same issue. One workaround is to disable FIPS for Kafka Connect by passing the JVM parameter {{-Dcom.redhat.fips=false}}; however, that leaves Kafka Connect and all of its workers out of compliance for anyone required to run on FIPS-enabled systems.

> Kafka Connect has a hard dependency on KeyGenerator.HmacSHA256
> --------------------------------------------------------------
>
>                 Key: KAFKA-13613
>                 URL: https://issues.apache.org/jira/browse/KAFKA-13613
>             Project: Kafka
>          Issue Type: Bug
>          Components: KafkaConnect
>    Affects Versions: 3.0.0
>         Environment: RHEL 8.5
> OpenJDK 1.8.0_312
> Confluent Platform 7.0.1 (Kafka 3.0.0)
>            Reporter: Guy Pascarella
>            Priority: Major
>



--
This message was sent by Atlassian Jira
(v8.20.1#820001)