Posted to user@drill.apache.org by Joseph Swingle <jo...@gmail.com> on 2016/07/05 18:29:22 UTC

Re: Drill - Hive - Kerberos

Yes, impersonation is enabled:

drill.exec: {
  cluster-id: "hhe",
  zk.connect: "zk1:2181,zk2:2181,zk3:2181",
  impersonation: {
    enabled: true,
    max_chained_user_hops: 3
  }
}
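
As a sanity check, the effective boot-time values can also be queried from sqlline (assuming the sys.boot system table in this Drill version exposes the impersonation options):

    SELECT * FROM sys.boot WHERE name LIKE '%impersonation%';

Note that drill-override.conf is only read at startup, so the drillbits must be restarted after editing it for the setting to take effect.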

On Mon, Jun 20, 2016 at 6:22 PM, Chun Chang <cc...@maprtech.com> wrote:

> Did you enable impersonation? Check the drill-override.conf file to verify
> that impersonation is enabled.
>
> On Mon, Jun 20, 2016 at 5:17 AM, Joseph Swingle <jo...@gmail.com>
> wrote:
>
> > Yes, secure cluster.  Strange that I can browse HDFS and can get the
> > metadata about Hive databases and tables,
> > but every SQL query pulling data from Hive tables results in that error.
> >
> >
> >
> >
> > > On Jun 17, 2016, at 6:24 PM, Chun Chang <cc...@maprtech.com> wrote:
> > >
> > > Hi Joseph,
> > >
> > > Are you running Drill on a secure cluster? I had success with the
> > > following storage plugin configuration on the MapR distribution, using
> > > SQL standard authorization with Kerberos:
> > >
> > > hive storage plugin:
> > >
> > > {
> > >   "type": "hive",
> > >   "enabled": true,
> > >   "configProps": {
> > >     "hive.metastore.uris": "thrift://10.10.100.120:9083",
> > >     "fs.default.name": "maprfs:///",
> > >     "hive.server2.enable.doAs": "false",
> > >     "hive.metastore.sasl.enabled": "true",
> > >     "hive.metastore.kerberos.principal": "hive/bigdata-node120.bd.lab@BD.LAB"
> > >   }
> > > }
> > >
> > >
> > > On Fri, Jun 17, 2016 at 1:28 PM, Joseph Swingle <jo...@gmail.com>
> > > wrote:
> > >
> > >> I have a Hive storage plugin configured (bottom).  I am using HDP 2.4
> > >> w/ Hive 1.2.1 and Drill 1.6.
> > >>
> > >> I can connect just fine with Drill Explorer.  I can browse and view
> > >> content on HDFS just fine with Drill Explorer; the .csv files etc.
> > >> display fine.
> > >>
> > >> I can browse the list of schemas in Hive just fine with Drill
> > >> Explorer.  But every SQL query (for example "select * from foo")
> > >> returns:
> > >> Caused by: java.io.IOException: Failed to get numRows from HiveTable
> > >>         at org.apache.drill.exec.store.hive.HiveMetadataProvider.getStats(HiveMetadataProvider.java:113) ~[drill-storage-hive-core-1.6.0.jar:1.6.0]
> > >>         at org.apache.drill.exec.store.hive.HiveScan.getScanStats(HiveScan.java:224) ~[drill-storage-hive-core-1.6.0.jar:1.6.0]
> > >>         ... 44 common frames omitted
> > >> Caused by: org.apache.drill.common.exceptions.DrillRuntimeException: Failed to create input splits: Can't get Master Kerberos principal for use as renewer
> > >>         at org.apache.drill.exec.store.hive.HiveMetadataProvider.splitInputWithUGI(HiveMetadataProvider.java:264) ~[drill-storage-hive-core-1.6.0.jar:1.6.0]
> > >>         at org.apache.drill.exec.store.hive.HiveMetadataProvider.getTableInputSplits(HiveMetadataProvider.java:128) ~[drill-storage-hive-core-1.6.0.jar:1.6.0]
> > >>         at org.apache.drill.exec.store.hive.HiveMetadataProvider.getStats(HiveMetadataProvider.java:96) ~[drill-storage-hive-core-1.6.0.jar:1.6.0]
> > >>         ... 45 common frames omitted
> > >> Caused by: java.io.IOException: Can't get Master Kerberos principal for use as renewer
> > >>         at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:116) ~[hadoop-mapreduce-client-core-2.7.1.jar:na]
> > >>         at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:100) ~[hadoop-mapreduce-client-core-2.7.1.jar:na]
> > >>         at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodes(TokenCache.java:80) ~[hadoop-mapreduce-client-core-2.7.1.jar:na]
> > >>         at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:206) ~[hadoop-mapreduce-client-core-2.7.1.jar:na]
> > >>         at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:315) ~[hadoop-mapreduce-client-core-2.7.1.jar:na]
> > >>         at org.apache.drill.exec.store.hive.HiveMetadataProvider$1.run(HiveMetadataProvider.java:253) ~[drill-storage-hive-core-1.6.0.jar:1.6.0]
> > >>         at org.apache.drill.exec.store.hive.HiveMetadataProvider$1.run(HiveMetadataProvider.java:241) ~[drill-storage-hive-core-1.6.0.jar:1.6.0]
> > >>         at java.security.AccessController.doPrivileged(Native Method) ~[na:1.8.0_45]
> > >>         at javax.security.auth.Subject.doAs(Subject.java:422) ~[na:1.8.0_45]
> > >>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657) ~[hadoop-common-2.7.1.jar:na]
> > >>         at org.apache.drill.exec.store.hive.HiveMetadataProvider.splitInputWithUGI(HiveMetadataProvider.java:241) ~[drill-storage-hive-core-1.6.0.jar:1.6.0]
> > >>         ... 47 common frames omitted
> > >>
> > >>
> > >>
> > >>
> > >> {
> > >>   "type": "hive",
> > >>   "enabled": true,
> > >>   "configProps": {
> > >>     "hive.metastore.uris": "thrift://<redacted>:9083,thrift://<redacted>:9083,thrift://<redacted>:9083",
> > >>     "javax.jdo.option.ConnectionURL": "jdbc:derby:;databaseName=../hive-drill-data/drill_hive_db;create=true",
> > >>     "hive.metastore.warehouse.dir": "/apps/hive/warehouse",
> > >>     "fs.default.name": "hdfs://<redacted>:8020",
> > >>     "hive.metastore.sasl.enabled": "true",
> > >>     "hive.security.authorization.enabled": "false",
> > >>     "hive.server2.enable.doAs": "true",
> > >>     "hive.metastore.kerberos.keytab.file": "/etc/security/keytabs/hive.service.keytab",
> > >>     "hive.metastore.kerberos.principal": "hive/<redacted>@ACT.LOCAL",
> > >>     "hive.security.authorization.manager": "org.apache.ranger.authorization.hive.authorizer.RangerHiveAuthorizerFactory"
> > >>   }
> > >> }
> > >>
> > >> Any help, even simply a pointer in the right direction, is appreciated.
> >
> >
>

Re: Drill - Hive - Kerberos

Posted by Sudheesh Katkam <sk...@maprtech.com>.
Can you set the following property to false in the Hive storage plugin configuration, and try again?

"hive.server2.enable.doAs": "false"

Thank you,
Sudheesh
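
For reference, that change would make the relevant part of the plugin's configProps look like this (a sketch based on the configuration posted earlier in the thread; hostnames remain redacted as in the original):

    "configProps": {
      ...
      "hive.metastore.sasl.enabled": "true",
      "hive.server2.enable.doAs": "false",
      ...
    }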
