Posted to issues@ozone.apache.org by "Aravindan Vijayan (Jira)" <ji...@apache.org> on 2020/02/26 23:09:00 UTC

[jira] [Updated] (HDDS-3071) Datanodes unable to connect to recon in Secure Environment

     [ https://issues.apache.org/jira/browse/HDDS-3071?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Aravindan Vijayan updated HDDS-3071:
------------------------------------
        Parent: HDDS-1996
    Issue Type: Sub-task  (was: Bug)

> Datanodes unable to connect to recon in Secure Environment
> ----------------------------------------------------------
>
>                 Key: HDDS-3071
>                 URL: https://issues.apache.org/jira/browse/HDDS-3071
>             Project: Hadoop Distributed Data Store
>          Issue Type: Sub-task
>          Components: Ozone Recon
>    Affects Versions: 0.6.0
>            Reporter: Vivek Ratnavel Subramanian
>            Assignee: Aravindan Vijayan
>            Priority: Major
>
> Datanodes throw this exception while connecting to recon.
> {code:java}
> datanode_1  | java.io.IOException: DestHost:destPort recon:9891 , LocalHost:localPort 6a99ad69685d/192.168.48.4:0. Failed on local exception: java.io.IOException: Couldn't set up IO streams: java.lang.IllegalArgumentException: Empty nameString not allowed
> datanode_1  |  at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> datanode_1  |  at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> datanode_1  |  at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> datanode_1  |  at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
> datanode_1  |  at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:831)
> datanode_1  |  at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:806)
> datanode_1  |  at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1515)
> datanode_1  |  at org.apache.hadoop.ipc.Client.call(Client.java:1457)
> datanode_1  |  at org.apache.hadoop.ipc.Client.call(Client.java:1367)
> datanode_1  |  at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
> datanode_1  |  at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
> datanode_1  |  at com.sun.proxy.$Proxy40.submitRequest(Unknown Source)
> datanode_1  |  at org.apache.hadoop.ozone.protocolPB.StorageContainerDatanodeProtocolClientSideTranslatorPB.submitRequest(StorageContainerDatanodeProtocolClientSideTranslatorPB.java:116)
> datanode_1  |  at org.apache.hadoop.ozone.protocolPB.StorageContainerDatanodeProtocolClientSideTranslatorPB.getVersion(StorageContainerDatanodeProtocolClientSideTranslatorPB.java:132)
> datanode_1  |  at org.apache.hadoop.ozone.container.common.states.endpoint.VersionEndpointTask.call(VersionEndpointTask.java:71)
> datanode_1  |  at org.apache.hadoop.ozone.container.common.states.endpoint.VersionEndpointTask.call(VersionEndpointTask.java:42)
> datanode_1  |  at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
> datanode_1  |  at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)
> datanode_1  |  at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
> datanode_1  |  at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
> datanode_1  |  at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
> datanode_1  |  at java.base/java.lang.Thread.run(Thread.java:834)
> datanode_1  | Caused by: java.io.IOException: Couldn't set up IO streams: java.lang.IllegalArgumentException: Empty nameString not allowed
> datanode_1  |  at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:866)
> datanode_1  |  at org.apache.hadoop.ipc.Client$Connection.access$3700(Client.java:411)
> datanode_1  |  at org.apache.hadoop.ipc.Client.getConnection(Client.java:1572)
> datanode_1  |  at org.apache.hadoop.ipc.Client.call(Client.java:1403)
> datanode_1  |  ... 14 more
> datanode_1  | Caused by: java.lang.IllegalArgumentException: Empty nameString not allowed
> datanode_1  |  at java.security.jgss/sun.security.krb5.PrincipalName.validateNameStrings(PrincipalName.java:174)
> datanode_1  |  at java.security.jgss/sun.security.krb5.PrincipalName.<init>(PrincipalName.java:397)
> datanode_1  |  at java.security.jgss/sun.security.krb5.PrincipalName.<init>(PrincipalName.java:471)
> datanode_1  |  at java.security.jgss/javax.security.auth.kerberos.KerberosPrincipal.<init>(KerberosPrincipal.java:172)
> datanode_1  |  at org.apache.hadoop.security.SaslRpcClient.getServerPrincipal(SaslRpcClient.java:305)
> datanode_1  |  at org.apache.hadoop.security.SaslRpcClient.createSaslClient(SaslRpcClient.java:234)
> datanode_1  |  at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:160)
> datanode_1  |  at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:390)
> datanode_1  |  at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:617)
> datanode_1  |  at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:411)
> datanode_1  |  at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:804)
> datanode_1  |  at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:800)
> datanode_1  |  at java.base/java.security.AccessController.doPrivileged(Native Method)
> datanode_1  |  at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
> datanode_1  |  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
> datanode_1  |  at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:800)
> datanode_1  |  ... 17 more
> {code}
> Recon throws an exception while connecting to SCM:
> {code:java}
> recon_1     | 2020-02-25 17:48:14,506 [main] ERROR scm.ReconStorageContainerManagerFacade: Exception encountered while getting pipelines from SCM.
> recon_1     | java.io.IOException: DestHost:destPort scm:9860 , LocalHost:localPort recon/192.168.48.8:0. Failed on local exception: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[KERBEROS]
> recon_1     |  at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> recon_1     |  at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
> recon_1     |  at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> recon_1     |  at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
> recon_1     |  at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:831)
> recon_1     |  at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:806)
> recon_1     |  at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1515)
> recon_1     |  at org.apache.hadoop.ipc.Client.call(Client.java:1457)
> recon_1     |  at org.apache.hadoop.ipc.Client.call(Client.java:1367)
> recon_1     |  at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:228)
> recon_1     |  at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:116)
> recon_1     |  at com.sun.proxy.$Proxy41.submitRequest(Unknown Source)
> recon_1     |  at org.apache.hadoop.hdds.scm.protocolPB.StorageContainerLocationProtocolClientSideTranslatorPB.submitRequest(StorageContainerLocationProtocolClientSideTranslatorPB.java:114)
> recon_1     |  at org.apache.hadoop.hdds.scm.protocolPB.StorageContainerLocationProtocolClientSideTranslatorPB.listPipelines(StorageContainerLocationProtocolClientSideTranslatorPB.java:322)
> recon_1     |  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> recon_1     |  at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
> recon_1     |  at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> recon_1     |  at java.base/java.lang.reflect.Method.invoke(Method.java:566)
> recon_1     |  at org.apache.hadoop.hdds.tracing.TraceAllMethod.invoke(TraceAllMethod.java:71)
> recon_1     |  at com.sun.proxy.$Proxy42.listPipelines(Unknown Source)
> recon_1     |  at org.apache.hadoop.ozone.recon.spi.impl.StorageContainerServiceProviderImpl.getPipelines(StorageContainerServiceProviderImpl.java:49)
> recon_1     |  at org.apache.hadoop.ozone.recon.scm.ReconStorageContainerManagerFacade.initializePipelinesFromScm(ReconStorageContainerManagerFacade.java:223)
> recon_1     |  at org.apache.hadoop.ozone.recon.scm.ReconStorageContainerManagerFacade.start(ReconStorageContainerManagerFacade.java:183)
> recon_1     |  at org.apache.hadoop.ozone.recon.ReconServer.start(ReconServer.java:118)
> recon_1     |  at org.apache.hadoop.ozone.recon.ReconServer.call(ReconServer.java:95)
> recon_1     |  at org.apache.hadoop.ozone.recon.ReconServer.call(ReconServer.java:39)
> recon_1     |  at picocli.CommandLine.execute(CommandLine.java:1173)
> recon_1     |  at picocli.CommandLine.access$800(CommandLine.java:141)
> recon_1     |  at picocli.CommandLine$RunLast.handle(CommandLine.java:1367)
> recon_1     |  at picocli.CommandLine$RunLast.handle(CommandLine.java:1335)
> recon_1     |  at picocli.CommandLine$AbstractParseResultHandler.handleParseResult(CommandLine.java:1243)
> recon_1     |  at picocli.CommandLine.parseWithHandlers(CommandLine.java:1526)
> recon_1     |  at picocli.CommandLine.parseWithHandler(CommandLine.java:1465)
> recon_1     |  at org.apache.hadoop.hdds.cli.GenericCli.execute(GenericCli.java:65)
> recon_1     |  at org.apache.hadoop.hdds.cli.GenericCli.run(GenericCli.java:56)
> recon_1     |  at org.apache.hadoop.ozone.recon.ReconServer.main(ReconServer.java:52)
> recon_1     | Caused by: java.io.IOException: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[KERBEROS]
> recon_1     |  at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:760)
> recon_1     |  at java.base/java.security.AccessController.doPrivileged(Native Method)
> recon_1     |  at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
> recon_1     |  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
> recon_1     |  at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:723)
> recon_1     |  at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:817)
> recon_1     |  at org.apache.hadoop.ipc.Client$Connection.access$3700(Client.java:411)
> recon_1     |  at org.apache.hadoop.ipc.Client.getConnection(Client.java:1572)
> recon_1     |  at org.apache.hadoop.ipc.Client.call(Client.java:1403)
> recon_1     |  ... 28 more
> recon_1     | Caused by: org.apache.hadoop.security.AccessControlException: Client cannot authenticate via:[KERBEROS]
> recon_1     |  at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:173)
> recon_1     |  at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:390)
> recon_1     |  at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:617)
> recon_1     |  at org.apache.hadoop.ipc.Client$Connection.access$2300(Client.java:411)
> recon_1     |  at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:804)
> recon_1     |  at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:800)
> recon_1     |  at java.base/java.security.AccessController.doPrivileged(Native Method)
> recon_1     |  at java.base/javax.security.auth.Subject.doAs(Subject.java:423)
> recon_1     |  at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
> recon_1     |  at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:800)
> recon_1     |  ... 31 more
> {code}
>  
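The innermost frames of the datanode trace (SaslRpcClient.getServerPrincipal -> KerberosPrincipal.<init> -> PrincipalName.validateNameStrings) show where the failure originates: the client-side configuration apparently yields an empty server principal for the Recon endpoint, and the JDK rejects an empty principal name outright. A minimal, JDK-only sketch of that check (the class name below is mine, not from Ozone; no Hadoop or Kerberos setup is involved):

```java
import javax.security.auth.kerberos.KerberosPrincipal;

// Reproduces the innermost exception in the datanode trace: constructing
// a KerberosPrincipal from an empty string fails in the JDK's principal
// name validation, which is what happens when the RPC client finds no
// configured server principal for the endpoint it is connecting to.
public class EmptyPrincipalDemo {
    public static void main(String[] args) {
        try {
            new KerberosPrincipal("");
            System.out.println("unexpected: empty principal accepted");
        } catch (IllegalArgumentException e) {
            // Matches the trace, e.g. "Empty nameString not allowed"
            System.out.println("IllegalArgumentException: " + e.getMessage());
        }
    }
}
```

This suggests the datanode-side fix is supplying a Recon server principal in the secure configuration, as opposed to the second (Recon-to-SCM) trace, where the client itself lacks Kerberos credentials.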



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
