Posted to reviews@ambari.apache.org by Andrii Babiichuk <ab...@hortonworks.com> on 2017/03/23 17:41:56 UTC

Review Request 57885: create table failing with HiveAccessControlException (additional patch)

-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/57885/
-----------------------------------------------------------

Review request for Ambari and Aleksandr Kovalenko.


Bugs: AMBARI-20516
    https://issues.apache.org/jira/browse/AMBARI-20516


Repository: ambari


Description
-------

The CREATE TABLE query fails with the error below:

```
2017-03-02T03:39:11,076 ERROR [970bf75a-09fc-4b25-9bdb-9019cd135966 HiveServer2-Handler-Pool: Thread-349]: authorizer.RangerHiveAuthorizer (RangerHiveAuthorizer.java:isURIAccessAllowed(1087)) - Error getting permissions for hdfs://cluster-hostname-04:8020/user/hcat/autoquerygen_data/data/table_1
java.lang.reflect.UndeclaredThrowableException
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1884) ~[hadoop-common-2.7.3.2.6.0.0-571.jar:?]
        at org.apache.hadoop.hive.common.FileUtils.checkFileAccessWithImpersonation(FileUtils.java:391) ~[hive-common-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.hadoop.hive.common.FileUtils.isActionPermittedForFileHierarchy(FileUtils.java:431) ~[hive-common-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.hadoop.hive.common.FileUtils.isActionPermittedForFileHierarchy(FileUtils.java:417) ~[hive-common-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizer.isURIAccessAllowed(RangerHiveAuthorizer.java:1070) [ranger-hive-plugin-0.7.0.2.6.0.0-571.jar:0.7.0.2.6.0.0-571]
        at org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizer.checkPrivileges(RangerHiveAuthorizer.java:251) [ranger-hive-plugin-0.7.0.2.6.0.0-571.jar:0.7.0.2.6.0.0-571]
        at org.apache.hadoop.hive.ql.Driver.doAuthorizationV2(Driver.java:846) [hive-exec-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.hadoop.hive.ql.Driver.doAuthorization(Driver.java:633) [hive-exec-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:492) [hive-exec-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:336) [hive-exec-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1197) [hive-exec-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:1184) [hive-exec-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:191) [hive-service-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:276) [hive-service-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.hive.service.cli.operation.Operation.run(Operation.java:312) [hive-service-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:508) [hive-service-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementAsync(HiveSessionImpl.java:495) [hive-service-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.hive.service.cli.CLIService.executeStatementAsync(CLIService.java:308) [hive-service-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:506) [hive-service-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1437) [hive-exec-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.hive.service.rpc.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1422) [hive-exec-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39) [hive-exec-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39) [hive-exec-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:599) [hive-exec-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:286) [hive-exec-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_112]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_112]
        at java.lang.Thread.run(Thread.java:745) [?:1.8.0_112]
Caused by: java.lang.reflect.InvocationTargetException
        at sun.reflect.GeneratedMethodAccessor38.invoke(Unknown Source) ~[?:?]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
        at org.apache.hadoop.hive.shims.Hadoop23Shims.checkFileAccess(Hadoop23Shims.java:943) ~[hive-exec-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.hadoop.hive.common.FileUtils$3.run(FileUtils.java:395) ~[hive-common-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_112]
        at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_112]
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866) ~[hadoop-common-2.7.3.2.6.0.0-571.jar:?]
        ... 27 more
Caused by: org.apache.hadoop.ipc.RemoteException: Unauthorized connection for super-user: hive/cluster-hostname-07@HWQE.HORTONWORKS.COM from IP 172.27.24.4
        at org.apache.hadoop.ipc.Client.getRpcResponse(Client.java:1554) ~[hadoop-common-2.7.3.2.6.0.0-571.jar:?]
        at org.apache.hadoop.ipc.Client.call(Client.java:1498) ~[hadoop-common-2.7.3.2.6.0.0-571.jar:?]
        at org.apache.hadoop.ipc.Client.call(Client.java:1398) ~[hadoop-common-2.7.3.2.6.0.0-571.jar:?]
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:233) ~[hadoop-common-2.7.3.2.6.0.0-571.jar:?]
        at com.sun.proxy.$Proxy37.checkAccess(Unknown Source) ~[?:?]
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.checkAccess(ClientNamenodeProtocolTranslatorPB.java:1537) ~[hadoop-hdfs-2.7.3.2.6.0.0-571.jar:?]
        at sun.reflect.GeneratedMethodAccessor39.invoke(Unknown Source) ~[?:?]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:282) ~[hadoop-common-2.7.3.2.6.0.0-571.jar:?]
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:194) ~[hadoop-common-2.7.3.2.6.0.0-571.jar:?]
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:176) ~[hadoop-common-2.7.3.2.6.0.0-571.jar:?]
        at com.sun.proxy.$Proxy38.checkAccess(Unknown Source) ~[?:?]
        at org.apache.hadoop.hdfs.DFSClient.checkAccess(DFSClient.java:3511) ~[hadoop-hdfs-2.7.3.2.6.0.0-571.jar:?]
        at org.apache.hadoop.hdfs.DistributedFileSystem$55.doCall(DistributedFileSystem.java:2384) ~[hadoop-hdfs-2.7.3.2.6.0.0-571.jar:?]
        at org.apache.hadoop.hdfs.DistributedFileSystem$55.doCall(DistributedFileSystem.java:2381) ~[hadoop-hdfs-2.7.3.2.6.0.0-571.jar:?]
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81) ~[hadoop-common-2.7.3.2.6.0.0-571.jar:?]
        at org.apache.hadoop.hdfs.DistributedFileSystem.access(DistributedFileSystem.java:2381) ~[hadoop-hdfs-2.7.3.2.6.0.0-571.jar:?]
        at sun.reflect.GeneratedMethodAccessor38.invoke(Unknown Source) ~[?:?]
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[?:1.8.0_112]
        at java.lang.reflect.Method.invoke(Method.java:498) ~[?:1.8.0_112]
        at org.apache.hadoop.hive.shims.Hadoop23Shims.checkFileAccess(Hadoop23Shims.java:943) ~[hive-exec-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at org.apache.hadoop.hive.common.FileUtils$3.run(FileUtils.java:395) ~[hive-common-2.1.0.2.6.0.0-571.jar:2.1.0.2.6.0.0-571]
        at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_112]
        at javax.security.auth.Subject.doAs(Subject.java:422) ~[?:1.8.0_112]
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1866) ~[hadoop-common-2.7.3.2.6.0.0-571.jar:?]
        ... 27 more
```

Query:


```
CREATE EXTERNAL TABLE table_1_txt (timestamp_col_1 TIMESTAMP, decimal3003_col_2 DECIMAL(30, 3), tinyint_col_3 TINYINT, decimal0101_col_4 DECIMAL(1, 1), boolean_col_5 BOOLEAN, float_col_6 FLOAT, bigint_col_7 BIGINT, varchar0098_col_8 VARCHAR(98), timestamp_col_9 TIMESTAMP, bigint_col_10 BIGINT, decimal0903_col_11 DECIMAL(9, 3), timestamp_col_12 TIMESTAMP, timestamp_col_13 TIMESTAMP, float_col_14 FLOAT, char0254_col_15 CHAR(254), double_col_16 DOUBLE, timestamp_col_17 TIMESTAMP, boolean_col_18 BOOLEAN, decimal2608_col_19 DECIMAL(26, 8), varchar0216_col_20 VARCHAR(216), string_col_21 STRING, bigint_col_22 BIGINT, boolean_col_23 BOOLEAN,
  timestamp_col_24 TIMESTAMP, boolean_col_25 BOOLEAN, decimal2016_col_26 DECIMAL(20, 16), string_col_27 STRING, decimal0202_col_28 DECIMAL(2, 2), float_col_29 FLOAT, decimal2020_col_30 DECIMAL(20, 20), boolean_col_31 BOOLEAN, double_col_32 DOUBLE, varchar0148_col_33 VARCHAR(148), decimal2121_col_34 DECIMAL(21, 21), tinyint_col_35 TINYINT, boolean_col_36 BOOLEAN, boolean_col_37 BOOLEAN, string_col_38 STRING, decimal3420_col_39 DECIMAL(34, 20), timestamp_col_40 TIMESTAMP, decimal1408_col_41 DECIMAL(14, 8), string_col_42 STRING, decimal0902_col_43 DECIMAL(9, 2), varchar0204_col_44 VARCHAR(204), boolean_col_45 BOOLEAN, timestamp_col_46 TIMESTAMP, boolean_col_47 BOOLEAN,
  bigint_col_48 BIGINT, boolean_col_49 BOOLEAN, smallint_col_50 SMALLINT, decimal0704_col_51 DECIMAL(7, 4), timestamp_col_52 TIMESTAMP, boolean_col_53 BOOLEAN, timestamp_col_54 TIMESTAMP, int_col_55 INT, decimal0505_col_56 DECIMAL(5, 5), char0155_col_57 CHAR(155), boolean_col_58 BOOLEAN, bigint_col_59 BIGINT, boolean_col_60 BOOLEAN, boolean_col_61 BOOLEAN, char0249_col_62 CHAR(249), boolean_col_63 BOOLEAN, timestamp_col_64 TIMESTAMP, decimal1309_col_65 DECIMAL(13, 9), int_col_66 INT, float_col_67 FLOAT, timestamp_col_68 TIMESTAMP, timestamp_col_69 TIMESTAMP, boolean_col_70 BOOLEAN, timestamp_col_71 TIMESTAMP,
  double_col_72 DOUBLE, boolean_col_73 BOOLEAN, char0222_col_74 CHAR(222), float_col_75 FLOAT, string_col_76 STRING, decimal2612_col_77 DECIMAL(26, 12), timestamp_col_78 TIMESTAMP, char0128_col_79 CHAR(128), timestamp_col_80 TIMESTAMP, double_col_81 DOUBLE, timestamp_col_82 TIMESTAMP, float_col_83 FLOAT, decimal2622_col_84 DECIMAL(26, 22), double_col_85 DOUBLE, float_col_86 FLOAT, decimal0907_col_87 DECIMAL(9, 7)) ROW FORMAT DELIMITED FIELDS TERMINATED BY '\001' LOCATION '/user/hcat/autoquerygen_data/data/table_1'
```

From HDFS, the folder permissions can be seen to be set as follows:


```
hrt_qa@cluster-hostname-07:/hwqe/hadoopqe$ hdfs dfs -ls /user/hcat/autoquerygen_data/data
Found 22 items
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:48 /user/hcat/autoquerygen_data/data/table_1
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:50 /user/hcat/autoquerygen_data/data/table_10
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:50 /user/hcat/autoquerygen_data/data/table_11
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:50 /user/hcat/autoquerygen_data/data/table_12
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:51 /user/hcat/autoquerygen_data/data/table_13
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:51 /user/hcat/autoquerygen_data/data/table_14
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:51 /user/hcat/autoquerygen_data/data/table_15
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:51 /user/hcat/autoquerygen_data/data/table_16
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:51 /user/hcat/autoquerygen_data/data/table_17
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:51 /user/hcat/autoquerygen_data/data/table_18
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:52 /user/hcat/autoquerygen_data/data/table_19
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:48 /user/hcat/autoquerygen_data/data/table_2
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:52 /user/hcat/autoquerygen_data/data/table_20
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:52 /user/hcat/autoquerygen_data/data/table_21
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:52 /user/hcat/autoquerygen_data/data/table_22
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:49 /user/hcat/autoquerygen_data/data/table_3
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:49 /user/hcat/autoquerygen_data/data/table_4
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:49 /user/hcat/autoquerygen_data/data/table_5
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:49 /user/hcat/autoquerygen_data/data/table_6
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:49 /user/hcat/autoquerygen_data/data/table_7
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:50 /user/hcat/autoquerygen_data/data/table_8
drwxrwxrwx   - hive hdfs          0 2017-03-02 17:50 /user/hcat/autoquerygen_data/data/table_9
```

Even with drwxrwxrwx permissions, it is not clear why hrt_qa is denied access to the folder.
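
For context, the innermost cause in the trace above ("Unauthorized connection for super-user: hive/...") is raised by the NameNode's proxy-user (impersonation) check rather than by HDFS file permissions, so the drwxrwxrwx mode alone would not satisfy it. A minimal way to inspect what the client-side core-site.xml advertises for Hive impersonation, assuming the usual hadoop.proxyuser.hive.* keys are in use on this cluster:

```
# Print the hosts and groups from which the NameNode accepts impersonation by 'hive'.
# Values are cluster-specific; if the HiveServer2 host is missing from the hosts list,
# the "Unauthorized connection for super-user" error above is the expected symptom.
hdfs getconf -confKey hadoop.proxyuser.hive.hosts
hdfs getconf -confKey hadoop.proxyuser.hive.groups
```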

The test setup sets the permissions and owner for the folder. After changing the ownership to the 'hrt_qa' user instead of 'hive', the query passes fine.
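
For reference, that ownership change looks roughly like the sketch below (path and group taken from the listing above; it needs to run as an HDFS superuser):

```
# Hand the staged data directory to hrt_qa; the 'hdfs' group is kept as shown in the listing.
hdfs dfs -chown -R hrt_qa:hdfs /user/hcat/autoquerygen_data/data/table_1
```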

This issue seems to appear only in the Ubuntu 16 run.


Diffs
-----

  ambari-web/app/controllers/wizard/step7/assign_master_controller.js 38e3b27 
  ambari-web/app/models/configs/theme/config_action.js b2ba09a 
  ambari-web/app/utils/configs/add_component_config_initializer.js 21fb6b4 
  ambari-web/app/utils/configs/move_hm_config_initializer.js ab150e9 
  ambari-web/app/utils/configs/move_hs_config_initializer.js 7b15d0d 
  ambari-web/test/controllers/wizard/step7/assign_master_controller_test.js 0f3c599 


Diff: https://reviews.apache.org/r/57885/diff/1/


Testing
-------

20602 passing (21s)
  128 pending


Thanks,

Andrii Babiichuk


Re: Review Request 57885: create table failing with HiveAccessControlException (additional patch)

Posted by Aleksandr Kovalenko <ak...@hortonworks.com>.
-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/57885/#review169901
-----------------------------------------------------------


Ship it!




Ship It!

- Aleksandr Kovalenko

