Posted to common-issues@hadoop.apache.org by "Kihwal Lee (JIRA)" <ji...@apache.org> on 2016/03/17 14:20:33 UTC
[jira] [Commented] (HADOOP-12914) RPC client should deal with the IP address change
[ https://issues.apache.org/jira/browse/HADOOP-12914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15199507#comment-15199507 ]
Kihwal Lee commented on HADOOP-12914:
-------------------------------------
HDFS-8068 and HADOOP-12125 are related.
> RPC client should deal with the IP address change
> -------------------------------------------------
>
> Key: HADOOP-12914
> URL: https://issues.apache.org/jira/browse/HADOOP-12914
> Project: Hadoop Common
> Issue Type: Bug
> Components: ipc
> Affects Versions: 2.7.2
> Environment: CentOS 7
> Reporter: Michiel Vanderlee
>
> I'm seeing HADOOP-7472 again for the datanode in v2.7.2.
> If I start the datanode before the DNS entry for the namenode resolves, it never retries the lookup and keeps failing with an UnknownHostException.
> A restart of the datanode fixes this.
> TRACE ipc.ProtobufRpcEngine: 31: Exception <- nn1.hdfs-namenode-rpc.service.consul:8020: versionRequest {java.net.UnknownHostException: Invalid host name: local host is: (unknown); destination host is: "nn1.hdfs-namenode-rpc.service.consul":8020; java.net.UnknownHostException; For more details see: http://wiki.apache.org/hadoop/UnknownHost}
> The error comes from the Connection constructor in org.apache.hadoop.ipc.Client.java, line 409:
> public Connection(ConnectionId remoteId, int serviceClass) throws IOException {
>   this.remoteId = remoteId;
>   this.server = remoteId.getAddress();
>   if (server.isUnresolved()) {
>     throw NetUtils.wrapException(server.getHostName(),
>         server.getPort(),
>         null,
>         0,
>         new UnknownHostException());
>   }
> The remoteId.address (InetSocketAddress) seems to resolve only on creation, and never again unless re-resolved manually.
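The behavior described above can be sketched in isolation: a java.net.InetSocketAddress performs its DNS lookup once at construction and caches the result, so picking up a new DNS entry requires building a fresh instance from the host name. The `reResolve` helper below is a hypothetical illustration of that fix, not code from the Hadoop patch.

```java
import java.net.InetSocketAddress;

public class ReResolveSketch {
    // Hypothetical helper: InetSocketAddress caches its resolution result at
    // construction time, so the only way to pick up a changed or newly created
    // DNS entry is to build a new instance from the original host name.
    static InetSocketAddress reResolve(InetSocketAddress addr) {
        // Constructing a new InetSocketAddress triggers a fresh DNS lookup.
        return new InetSocketAddress(addr.getHostName(), addr.getPort());
    }

    public static void main(String[] args) {
        // An address created before the name resolves stays unresolved forever;
        // no method on the existing object will retry the lookup.
        InetSocketAddress stale =
            InetSocketAddress.createUnresolved("nn1.hdfs-namenode-rpc.service.consul", 8020);
        System.out.println(stale.isUnresolved()); // true

        // Host name and port survive re-resolution even if DNS still fails.
        InetSocketAddress fresh = reResolve(stale);
        System.out.println(fresh.getHostName());
        System.out.println(fresh.getPort());
    }
}
```

This is consistent with the quoted Connection constructor: since remoteId.getAddress() returns the address resolved at ConnectionId creation, a retry loop that never rebuilds the InetSocketAddress can never recover from an early UnknownHostException.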
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)