Posted to users@kafka.apache.org by Ivan Janes <iv...@oryxgaming.com> on 2019/06/05 16:10:45 UTC

Re: SASL + SSL : authentication error in broker-to-broker communication

Hi,

Maybe the problem is the zookeeper.connect setting:

zookeeper.connect=zoo:2181/kafka

That means you have to use the /kafka path with kafka-configs.sh:

bin/kafka-configs.sh --zookeeper localhost:2181/kafka --alter --add-config 'SCRAM-SHA-256=[iterations=4096,password=alice-secret],SCRAM-SHA-512=[password=alice-secret]' --entity-type users --entity-name alice

Change --zookeeper localhost:2181 to --zookeeper localhost:2181/kafka.

Without the /kafka prefix, the users are created under the wrong path and the Kafka broker does not find them.
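
To double-check where the credentials landed, you can for example list the user znodes directly or describe the user through the chroot (assuming the same /kafka chroot and the alice example above):

bin/zookeeper-shell.sh localhost:2181 ls /kafka/config/users
bin/kafka-configs.sh --zookeeper localhost:2181/kafka --describe --entity-type users --entity-name alice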

It took me about two weeks to figure this out :)

Regards,
Ivan


On 2019/05/15 12:28:03, Martin Gainty <m....@hotmail.com> wrote: 
> Assuming ScramSaslProvider/ScramSaslServer, your credentials are stored in ZK under /config/users/<encoded-user>,
> but you cannot see the plain-text attributes in ZK, so use the Kafka tool to view them:
> kafka-configs.sh --zookeeper localhost:2181 --describe --entity-type users --entity-name <user>
> 
> /* 2019 update for kafka-configs.sh */
> 
> For ease of use, kafka-configs.sh will take a password and an optional iteration count and generate a random salt, ServerKey and StoredKey as specified in RFC 5802 <https://tools.ietf.org/html/rfc5802>. For example:
> 
> bin/kafka-configs.sh --zookeeper localhost:2181 --alter --add-config 'SCRAM-SHA-256=[iterations=4096,password=alice-secret],SCRAM-SHA-512=[password=alice-secret]' --entity-type users --entity-name alice
> 
> /* once you have verified the username and password from the ZK credentials */
> you can now export your cert from /opt/kafka/ssl/kafka.server.keystore.jks:
> keytool -exportcert -alias admin -keystore /opt/kafka/ssl/kafka.server.keystore.jks -keypass xxxx -storepass xxxx -file admin.cert
> 
> (note: -storepass here is the password of the keystore you are exporting from)
> 
> If you can view admin.cert with a cert viewer, validate that the username (subject) is consistent with the ZK creds.
> If you don't have a cert viewer, you can convert the exported cert to PEM:
> openssl x509 -inform DER -in admin.cert -out admin.pem
> Check that the UID in either the cert or the PEM is consistent with ZK.
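> 
> For example (a quick sketch with the file names above), the certificate subject can be printed directly with either tool:
> keytool -printcert -file admin.cert
> openssl x509 -in admin.pem -noout -subject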
> 
> Finally, check that the ZK credentials are propagated to jaas.conf:
> 
> # used by inter-broker connections
> KafkaServer {
>     org.apache.kafka.common.security.scram.ScramLoginModule required
>     username="admin"
>     password="xxxx";
> };
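> 
> As an aside, the same login module can also be set per listener directly in server.properties instead of a separate JAAS file (supported since Kafka 0.10.2); a minimal sketch for this setup:
> listener.name.sasl_ssl.scram-sha-512.sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="admin" password="xxxx";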
> 
> If the username and password are consistent for all entities,
> then your kafka-broker(s) *should* authenticate (assuming they all reference the same ZK server!).
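> 
> To sanity-check the SCRAM credentials end to end, one option (a rough sketch; file name and topic are placeholders) is to try a console client with the same user. A client.properties along these lines:
> 
> security.protocol=SASL_SSL
> sasl.mechanism=SCRAM-SHA-512
> sasl.jaas.config=org.apache.kafka.common.security.scram.ScramLoginModule required username="admin" password="xxxx";
> ssl.truststore.location=/opt/kafka/ssl/kafka.server.truststore.jks
> ssl.truststore.password=xxxx
> 
> (plus ssl.keystore.* entries if the brokers set ssl.client.auth=required, as in the config below), and then:
> bin/kafka-console-producer.sh --broker-list broker-host:9093 --topic test --producer.config client.properties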
> 
> Good luck.
> https://cwiki.apache.org/confluence/display/KAFKA/KIP-84%3A+Support+SASL+SCRAM+mechanisms#KIP-84:SupportSASLSCRAMmechanisms-JAASconfiguration
> 
> 
> 
> 
> ________________________________> 
> From: Kieran JOYEUX <kj...@splio.com>> 
> Sent: Wednesday, May 15, 2019 4:42 AM> 
> To: users@kafka.apache.org> 
> Subject: SASL + SSL : authentication error in broker-to-broker communication> 
> 
> Hello,> 
> 
> I'm having trouble activating SASL on my current, working SSL-only cluster. I have read the docs many times and my configuration seems to be correct. However, it looks like Kafka cannot authenticate, and broker-to-broker communication is not working at all.
> 
> Do you have any ideas? (Details below.)
> 
> Thanks a lot.> 
> 
> Kieran> 
> 
> --------------------------------------------> 
> 
> # Versions> 
> Kafka: 2.2.0> 
> Zookeeper: 3.4.9-3+deb9u1> 
> 
> # Error message> 
> [2019-05-15 10:14:00,811] DEBUG Set SASL server state to HANDSHAKE_OR_VERSIONS_REQUEST during authentication (org.apache.kafka.common.security.authenticator.SaslServerAuthenticator)> 
> [2019-05-15 10:14:00,811] DEBUG Handling Kafka request API_VERSIONS during authentication (org.apache.kafka.common.security.authenticator.SaslServerAuthenticator)> 
> [2019-05-15 10:14:00,811] DEBUG Set SASL server state to HANDSHAKE_REQUEST during authentication (org.apache.kafka.common.security.authenticator.SaslServerAuthenticator)> 
> [2019-05-15 10:14:00,812] DEBUG Handling Kafka request SASL_HANDSHAKE during authentication (org.apache.kafka.common.security.authenticator.SaslServerAuthenticator)> 
> [2019-05-15 10:14:00,812] DEBUG Using SASL mechanism 'SCRAM-SHA-512' provided by client (org.apache.kafka.common.security.authenticator.SaslServerAuthenticator)> 
> [2019-05-15 10:14:00,813] DEBUG Setting SASL/SCRAM_SHA_512 server state to RECEIVE_CLIENT_FIRST_MESSAGE (org.apache.kafka.common.security.scram.internals.ScramSaslServer)> 
> [2019-05-15 10:14:00,813] DEBUG Set SASL server state to AUTHENTICATE during authentication (org.apache.kafka.common.security.authenticator.SaslServerAuthenticator)> 
> [2019-05-15 10:14:00,814] DEBUG Setting SASL/SCRAM_SHA_512 server state to FAILED (org.apache.kafka.common.security.scram.internals.ScramSaslServer)> 
> [2019-05-15 10:14:00,814] DEBUG Set SASL server state to FAILED during authentication (org.apache.kafka.common.security.authenticator.SaslServerAuthenticator)> 
> [2019-05-15 10:14:00,814] INFO [SocketServer brokerId=2] Failed authentication with 10.101.60.15 (Authentication failed during authentication due to invalid credentials with SASL mechanism SCRAM-SHA-512) (org.apache.kafka.common.network.Selector)> 
> [2019-05-15 10:14:00,815] DEBUG [SocketServer brokerId=2] Connection with 10.101.60.15 disconnected (org.apache.kafka.common.network.Selector)> 
> java.io.EOFException> 
> at org.apache.kafka.common.network.SslTransportLayer.read(SslTransportLayer.java:573)> 
> at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:94)> 
> at org.apache.kafka.common.security.authenticator.SaslServerAuthenticator.authenticate(SaslServerAuthenticator.java:267)> 
> at org.apache.kafka.common.network.KafkaChannel.prepare(KafkaChannel.java:173)> 
> at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:536)> 
> at org.apache.kafka.common.network.Selector.poll(Selector.java:472)> 
> at kafka.network.Processor.poll(SocketServer.scala:830)> 
> at kafka.network.Processor.run(SocketServer.scala:730)> 
> at java.lang.Thread.run(Thread.java:748)> 
> 
> 
> # User creation in ZK & output
> /opt/kafka/bin/kafka-configs.sh --zookeeper xxxx:2181 --alter --add-config 'SCRAM-SHA-512=[password=xxxx]' --entity-type users --entity-name admin
> Configs for user-principal 'admin' are SCRAM-SHA-512=salt=bnBicjI4NWd5dDBweGJoMmJ1bnlzdzFxYQ==,stored_key=xxxxx,server_key=xxxxxx==,iterations=4096
> 
> 
> # ps fauxww> 
> kafka     2523  7.1 15.9 5838668 972848 ?      Ssl  mai14  52:46 java -Xmx1G -Xms1G -server -XX:+UseG1GC -XX:MaxGCPauseMillis=20 -XX:InitiatingHeapOccupancyPercent=35 -XX:+ExplicitGCInvokesConcurrent -Djava.awt.headless=true -Xloggc:/var/log/kafka/kafkaServer-gc.log -verbose:gc -XX:+PrintGCDetails -XX:+PrintGCDateStamps -XX:+PrintGCTimeStamps -XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=10 -XX:GCLogFileSize=100M -Dcom.sun.management.jmxremote -Dcom.sun.management.jmxremote.authenticate=false -Dcom.sun.management.jmxremote.ssl=false -Dcom.sun.management.jmxremote.port=9990 -Djava.security.auth.login.config=/opt/kafka/config/kafka_server_jaas.conf -Djava.rmi.server.hostname=xxxxx -Dkafka.logs.dir=/var/log/kafka -Dlog4j.configuration=file:/opt/kafka/config/log4j.properties -cp /opt/kafka/bin/../libs/activation-1.1.1.jar:/opt/kafka/bin/../libs/aopalliance-repackaged-2.5.0-b42.jar:/opt/kafka/bin/../libs/argparse4j-0.7.0.jar:/opt/kafka/bin/../libs/audience-annotations-0.5.0.jar:/opt/kafka/bin/../libs/commons-lang3-3.8.1.jar:/opt/kafka/bin/../libs/connect-api-2.2.0.jar:/opt/kafka/bin/../libs/connect-basic-auth-extension-2.2.0.jar:/opt/kafka/bin/../libs/connect-file-2.2.0.jar:/opt/kafka/bin/../libs/connect-json-2.2.0.jar:/opt/kafka/bin/../libs/connect-runtime-2.2.0.jar:/opt/kafka/bin/../libs/connect-transforms-2.2.0.jar:/opt/kafka/bin/../libs/guava-20.0.jar:/opt/kafka/bin/../libs/hk2-api-2.5.0-b42.jar:/opt/kafka/bin/../libs/hk2-locator-2.5.0-b42.jar:/opt/kafka/bin/../libs/hk2-utils-2.5.0-b42.jar:/opt/kafka/bin/../libs/jackson-annotations-2.9.8.jar:/opt/kafka/bin/../libs/jackson-core-2.9.8.jar:/opt/kafka/bin/../libs/jackson-databind-2.9.8.jar:/opt/kafka/bin/../libs/jackson-datatype-jdk8-2.9.8.jar:/opt/kafka/bin/../libs/jackson-jaxrs-base-2.9.8.jar:/opt/kafka/bin/../libs/jackson-jaxrs-json-provider-2.9.8.jar:/opt/kafka/bin/../libs/jackson-module-jaxb-annotations-2.9.8.jar:/opt/kafka/bin/../libs/javassist-3.22.0-CR2.jar:/opt/kafka/bin/../libs/javax.annotation-api-1.2.jar:/opt/kafka/bin/../libs/javax.inject-1.jar:/opt/kafka/bin/../libs/javax.inject-2.5.0-b42.jar:/opt/kafka/bin/../libs/javax.servlet-api-3.1.0.jar:/opt/kafka/bin/../libs/javax.ws.rs-api-2.1.1.jar:/opt/kafka/bin/../libs/javax.ws.rs-api-2.1.jar:/opt/kafka/bin/../libs/jaxb-api-2.3.0.jar:/opt/kafka/bin/../libs/jersey-client-2.27.jar:/opt/kafka/bin/../libs/jersey-common-2.27.jar:/opt/kafka/bin/../libs/jersey-container-servlet-2.27.jar:/opt/kafka/bin/../libs/jersey-container-servlet-core-2.27.jar:/opt/kafka/bin/../libs/jersey-hk2-2.27.jar:/opt/kafka/bin/../libs/jersey-media-jaxb-2.27.jar:/opt/kafka/bin/../libs/jersey-server-2.27.jar:/opt/kafka/bin/../libs/jetty-client-9.4.14.v20181114.jar:/opt/kafka/bin/../libs/jetty-continuation-9.4.14.v20181114.jar:/opt/kafka/bin/../libs/jetty-http-9.4.14.v20181114.jar:/opt/kafka/bin/../libs/jetty-io-9.4.14.v20181114.jar:/opt/kafka/bin/../libs/jetty-security-9.4.14.v20181114.jar:/opt/kafka/bin/../libs/jetty-server-9.4.14.v20181114.jar:/opt/kafka/bin/../libs/jetty-servlet-9.4.14.v20181114.jar:/opt/kafka/bin/../libs/jetty-servlets-9.4.14.v20181114.jar:/opt/kafka/bin/../libs/jetty-util-9.4.14.v20181114.jar:/opt/kafka/bin/../libs/jopt-simple-5.0.4.jar:/opt/kafka/bin/../libs/kafka_2.11-2.2.0.jar:/opt/kafka/bin/../libs/kafka_2.11-2.2.0-sources.jar:/opt/kafka/bin/../libs/kafka-clients-2.2.0.jar:/opt/kafka/bin/../libs/kafka-log4j-appender-2.2.0.jar:/opt/kafka/bin/../libs/kafka-streams-2.2.0.jar:/opt/kafka/bin/../libs/kafka-streams-examples-2.2.0.jar:/opt/kafka/bin/../libs/kafka-streams-scala_2.11-2.2.0.jar:/opt
/kafka/bin/../libs/kafka-streams-test-utils-2.2.0.jar:/opt/kafka/bin/../libs/kafka-tools-2.2.0.jar:/opt/kafka/bin/../libs/log4j-1.2.17.jar:/opt/kafka/bin/../libs/lz4-java-1.5.0.jar:/opt/kafka/bin/../libs/maven-artifact-3.6.0.jar:/opt/kafka/bin/../libs/metrics-core-2.2.0.jar:/opt/kafka/bin/../libs/osgi-resource-locator-1.0.1.jar:/opt/kafka/bin/../libs/plexus-utils-3.1.0.jar:/opt/kafka/bin/../libs/reflections-0.9.11.jar:/opt/kafka/bin/../libs/rocksdbjni-5.15.10.jar:/opt/kafka/bin/../libs/scala-library-2.11.12.jar:/opt/kafka/bin/../libs/scala-logging_2.11-3.9.0.jar:/opt/kafka/bin/../libs/scala-reflect-2.11.12.jar:/opt/kafka/bin/../libs/slf4j-api-1.7.25.jar:/opt/kafka/bin/../libs/slf4j-log4j12-1.7.25.jar:/opt/kafka/bin/../libs/snappy-java-1.1.7.2.jar:/opt/kafka/bin/../libs/validation-api-1.1.0.Final.jar:/opt/kafka/bin/../libs/zkclient-0.11.jar:/opt/kafka/bin/../libs/zookeeper-3.4.13.jar:/opt/kafka/bin/../libs/zstd-jni-1.3.8-1.jar kafka.Kafka /opt/kafka/config/server.properties> 
> 
> 
> # Broker conf
> auto.create.topics.enable=false
> broker.id=1
> compression.type=snappy
> delete.topic.enable=true
> listeners=SASL_SSL://:9093
> log.dir=/var/lib/kafka
> min.insync.replicas=2
> sasl.enabled.mechanisms=SCRAM-SHA-512,PLAIN
> sasl.mechanism.inter.broker.protocol=SCRAM-SHA-512
> security.inter.broker.protocol=SASL_SSL
> ssl.client.auth=required
> ssl.enabled.protocols=TLSv1.2
> ssl.endpoint.identification.algorithm=
> ssl.key.password=xxx
> ssl.keystore.location=/opt/kafka/ssl/kafka.server.keystore.jks
> ssl.keystore.password=xxx
> ssl.keystore.type=JKS
> ssl.secure.random.implementation=SHA1PRNG
> ssl.truststore.location=/opt/kafka/ssl/kafka.server.truststore.jks
> ssl.truststore.password=xxx
> ssl.truststore.type=JKS
> 
> # /opt/kafka/config/kafka_server_jaas.conf
> KafkaServer {
>  org.apache.kafka.common.security.scram.ScramLoginModule required
>  username="admin"
>  password="adminpass";
> 
>  org.apache.kafka.common.security.plain.PlainLoginModule required
>  username="admin"
>  password="adminpass"
>  user_admin="adminpass"
>  user_app="blabla";
> };
> 
>  

Lep pozdrav / Kind regards,
Ivan Janeš
System Engineer

Email: ivan.janes@oryxgaming.com
Skype: ivan.oryx
Mobile: +386 31 862 833



