Posted to issues@ignite.apache.org by "Denis Magda (JIRA)" <ji...@apache.org> on 2016/01/11 08:44:39 UTC

[jira] [Updated] (IGNITE-2195) Accessing from IGFS to HDFS that is in kerberised environment

     [ https://issues.apache.org/jira/browse/IGNITE-2195?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Denis Magda updated IGNITE-2195:
--------------------------------
    Attachment: kerbersized_hadoop_fs_factory.zip

> Accessing from IGFS to HDFS that is in kerberised environment
> -------------------------------------------------------------
>
>                 Key: IGNITE-2195
>                 URL: https://issues.apache.org/jira/browse/IGNITE-2195
>             Project: Ignite
>          Issue Type: Bug
>          Components: hadoop, IGFS
>    Affects Versions: ignite-1.4
>            Reporter: Denis Magda
>            Assignee: Ivan Veselovsky
>            Priority: Critical
>              Labels: important
>             Fix For: 1.6
>
>         Attachments: kerbersized_hadoop_fs_factory.zip
>
>
> The current IGFS implementation does not take some Kerberos-related user settings into account, which leads to the exception below when an attempt is made to work with a Kerberised cluster:
> {noformat}
> Connecting to HDFS with the following settings [uri=null, cfg=all-site.xml, userName=null]
> log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
> log4j:WARN Please initialize the log4j system properly.
> log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
> org.apache.hadoop.security.AccessControlException: SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
> at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
> at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
> at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
> at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
> at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
> at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
> at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2096)
> at org.apache.hadoop.hdfs.DistributedFileSystem$DirListingIterator.<init>(DistributedFileSystem.java:944)
> at org.apache.hadoop.hdfs.DistributedFileSystem$DirListingIterator.<init>(DistributedFileSystem.java:927)
> at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:872)
> at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:868)
> at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
> at org.apache.hadoop.hdfs.DistributedFileSystem.listLocatedStatus(DistributedFileSystem.java:868)
> at org.apache.hadoop.fs.FileSystem.listLocatedStatus(FileSystem.java:1694)
> at org.apache.hadoop.fs.FileSystem$6.<init>(FileSystem.java:1786)
> at org.apache.hadoop.fs.FileSystem.listFiles(FileSystem.java:1783)
> at com.ig.HadoopFsIssue.main(HadoopFsIssue.java:35)
> Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
> at org.apache.hadoop.ipc.Client.call(Client.java:1427)
> at org.apache.hadoop.ipc.Client.call(Client.java:1358)
> at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:229)
> at com.sun.proxy.$Proxy7.getListing(Unknown Source)
> at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:573)
> at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
> at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
> at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
> at java.lang.reflect.Method.invoke(Method.java:606)
> at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
> at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
> at com.sun.proxy.$Proxy8.getListing(Unknown Source)
> at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:2094)
> {noformat}
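> For reference, the failure occurs because the Hadoop client falls back to SIMPLE authentication when UserGroupInformation has not been initialized with the Kerberised configuration. A minimal sketch of the relevant setting (normally supplied via core-site.xml; the explicit set() call and class name below are for illustration only):
> {noformat}
> import org.apache.hadoop.conf.Configuration;
> import org.apache.hadoop.fs.Path;
> import org.apache.hadoop.security.UserGroupInformation;
>
> public class KerberosConfSketch {
>     public static void main(String[] args) {
>         // On a Kerberised cluster the Hadoop configuration carries security settings
>         // like the one below (normally read from core-site.xml). UserGroupInformation
>         // must be initialized with such a configuration before any FileSystem call is
>         // made; otherwise the client attempts SIMPLE authentication and the NameNode
>         // rejects it with the exception shown above.
>         Configuration cfg = new Configuration();
>         cfg.addResource(new Path("all-site.xml")); // The client config from the log above.
>         cfg.set("hadoop.security.authentication", "kerberos");
>         UserGroupInformation.setConfiguration(cfg);
>     }
> }
> {noformat}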
> The issue is fixed in the following way. The fix needs to be revisited to check whether it can lead to any other unintended consequences.
> {noformat}
> /**
>  * @return {@link org.apache.hadoop.fs.FileSystem} instance for this secondary file system.
>  * @throws IOException If the file system could not be created.
>  */
> public FileSystem createFileSystem(String userName) throws IOException {
>     userName = IgfsUtils.fixUserName(userName);
>
>     // Initialize UserGroupInformation with the Hadoop configuration so that the
>     // Kerberos security settings (hadoop.security.authentication, etc.) are picked up;
>     // without this call the client falls back to SIMPLE authentication.
>     UserGroupInformation.setConfiguration(cfg);
>
>     // Act on behalf of the target user, proxied through the current
>     // (Kerberos-authenticated) user.
>     UserGroupInformation ugi = UserGroupInformation.createProxyUser(userName, UserGroupInformation.getCurrentUser());
>
>     try {
>         return ugi.doAs(new PrivilegedExceptionAction<FileSystem>() {
>             @Override public FileSystem run() throws Exception {
>                 return FileSystem.get(uri, cfg);
>             }
>         });
>     }
>     catch (InterruptedException e) {
>         Thread.currentThread().interrupt();
>
>         throw new IOException("Failed to create file system due to interrupt.", e);
>     }
> }
> {noformat}
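> For completeness, a sketch of how a caller would use this on a Kerberised cluster (the principal, keytab path and factory variable are made-up examples, not part of the fix): the process must first authenticate to Kerberos, e.g. from a keytab, so that getCurrentUser() returns a Kerberos-authenticated user for createProxyUser() to build on.
> {noformat}
> // Hypothetical usage sketch: the principal, keytab path and factory instance
> // are examples only. Log the process in to Kerberos from a keytab, then create
> // the secondary file system for an arbitrary IGFS user via the proxy-user path.
> UserGroupInformation.loginUserFromKeytab("ignite@EXAMPLE.COM", "/etc/security/keytabs/ignite.keytab");
>
> FileSystem secondaryFs = factory.createFileSystem("someIgfsUser");
> {noformat}
> Note that impersonation via createProxyUser() also requires the proxy-user rules (hadoop.proxyuser.<user>.hosts / hadoop.proxyuser.<user>.groups) to be configured on the NameNode side, otherwise the doAs() call will be rejected.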



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)