Posted to mapreduce-user@hadoop.apache.org by gzlishuang <gz...@corp.netease.com> on 2014/12/15 04:16:01 UTC

Configuring Hue: "Server has invalid Kerberos principal" error

I have a Kerberized Hadoop cluster (CDH 5.1.2) that has been working fine for a long time.

Recently I have been trying to add Oozie as another component of the cluster.

Everything went fine until this step:


hadoop@gdc-dn06-69:~/oozie$  ./bin/oozie-setup.sh sharelib create -fs hdfs://10.120.69.100:9000 -locallib ~/share/

This step installs the Oozie shared library into HDFS under the oozie user's home directory.
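
As a quick sanity check, the same NameNode URI can also be exercised with a plain HDFS shell command, independently of the Oozie setup script; -fs is Hadoop's generic option that overrides the default filesystem for a single command, and /user/hadoop is just the home directory where the sharelib would land:

hadoop@gdc-dn06-69:~/oozie$ hdfs dfs -fs hdfs://10.120.69.100:9000 -ls /user/hadoop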

The error log is as follows:

hadoop@gdc-dn06-69:~/oozie$ ./bin/oozie-setup.sh sharelib create -fs hdfs://10.120.69.100:9000 -locallib ~/share/
 setting CATALINA_OPTS="$CATALINA_OPTS -Xmx1024m"
 setting OOZIE_LOG=/home/hadoop/logs
 setting OOZIE_PID=/home/hadoop/pids
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hadoop/src/oozie-4.0.0-cdh5.1.2/libtools/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hadoop/src/oozie-4.0.0-cdh5.1.2/libtools/slf4j-simple-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
the destination path for sharelib is: /user/hadoop/share/lib/lib_20141215105841

Error: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hadoop/gdc-nn01-69.i.nease.net@NIE.NETEASE.COM; Host Details : local host is: "gdc-dn06-69.i.nease.net/10.120.69.106"; destination host is: "gdc-nn01-69.i.nease.net":9000;

Stack trace for the error was (for debug purposes):
--------------------------------------
java.io.IOException: Failed on local exception: java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hadoop/gdc-nn01-69.i.nease.net@NIE.NETEASE.COM; Host Details : local host is: "gdc-dn06-69.i.nease.net/10.120.69.106"; destination host is: "gdc-nn01-69.i.nease.net":9000;
	at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:764)
	at org.apache.hadoop.ipc.Client.call(Client.java:1413)
	at org.apache.hadoop.ipc.Client.call(Client.java:1362)
	at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
	at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:186)
	at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
	at com.sun.proxy.$Proxy9.getFileInfo(Unknown Source)
	at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:701)
	at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1758)
	at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1124)
	at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
	at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
	at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
	at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1398)
	at org.apache.hadoop.fs.FileUtil.checkDest(FileUtil.java:496)
	at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:348)
	at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:338)
	at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1903)
	at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1871)
	at org.apache.oozie.tools.OozieSharelibCLI.run(OozieSharelibCLI.java:165)
	at org.apache.oozie.tools.OozieSharelibCLI.main(OozieSharelibCLI.java:56)
Caused by: java.io.IOException: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hadoop/gdc-nn01-69.i.nease.net@NIE.NETEASE.COM
	at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:677)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
	at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:640)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:724)
	at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
	at org.apache.hadoop.ipc.Client.getConnection(Client.java:1461)
	at org.apache.hadoop.ipc.Client.call(Client.java:1380)
	... 24 more
Caused by: java.lang.IllegalArgumentException: Server has invalid Kerberos principal: hadoop/gdc-nn01-69.i.nease.net@NIE.NETEASE.COM
	at org.apache.hadoop.security.SaslRpcClient.getServerPrincipal(SaslRpcClient.java:331)
	at org.apache.hadoop.security.SaslRpcClient.createSaslClient(SaslRpcClient.java:228)
	at org.apache.hadoop.security.SaslRpcClient.selectSaslClient(SaslRpcClient.java:157)
	at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:393)
	at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:550)
	at org.apache.hadoop.ipc.Client$Connection.access$1800(Client.java:367)
	at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:716)
	at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:712)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1554)
	at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:711)
	... 27 more
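
As far as I understand, the HDFS client builds the NameNode principal it expects from dfs.namenode.kerberos.principal in the client-side hdfs-site.xml, replacing _HOST with the hostname that the connection address resolves to, and this exception is thrown when the principal offered by the server does not match that value. For reference, an entry of the form below is what the check compares against (the service name "hadoop" and the realm are only taken from the principal shown in the error message, not confirmed from my configuration):

<property>
  <name>dfs.namenode.kerberos.principal</name>
  <value>hadoop/_HOST@NIE.NETEASE.COM</value>
</property>

The value the client actually picks up can be printed with "hdfs getconf -confKey dfs.namenode.kerberos.principal", and since I pass -fs hdfs://10.120.69.100:9000 as an IP address, the _HOST substitution also depends on that address resolving back to gdc-nn01-69.i.nease.net.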


I have searched Google for a long time, but in vain.

Could anyone help me?