Posted to user@hadoop.apache.org by Hema Kumar Sunapu <he...@gmail.com> on 2015/02/26 13:24:37 UTC

Problem with Running Pig on Secure Hadoop cluster

Hi,

Scenario: Read data from hbase table and store as csv in hdfs.

Problem: Both HDFS and HBase are secured with Kerberos, each with a
different keytab file and a different user name.
If I do *kinit* with the HDFS keytab, the job launches mappers, but they
are not able to read from the HBase table.
Error: Insufficient permissions for user 'hdfs user' for scanner open on
table 'tableX'

If I do kinit with the HBase keytab and run the script, it does not even
launch mappers.
Error: You don't have permission to perform the operation. Error from the
server: Permission denied: user=hbaseuser, access=EXECUTE,
inode="/user":hdfs:hadoop:drwx------

A possible workaround: if I grant execute permission on the /user
directory in HDFS, then kinit with the HBase keytab and run the Pig
script, it may work (I'm not sure). I don't want to do this.
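For reference, the workaround described above would amount to something like the following (the keytab path and principal are placeholders; execute-only for "others" matches the access=EXECUTE in the error, and grants traversal without read or write):

```shell
# Log in as the HDFS superuser (keytab path and principal are assumptions).
kinit -kt /etc/security/keytabs/hdfs.keytab hdfs@EXAMPLE.COM

# Grant world execute (directory traversal) on /user so hbaseuser can
# reach its job staging directory; this does NOT grant read or list.
hdfs dfs -chmod 711 /user
```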

The HBaseStorage class checks for Kerberos credentials; if
hasKerberosCredentials is true, it passes them to the Job so it can
execute properly. That check is failing in my case.

Is there any way to log in to HBase from Pig using *loginUserFromKeytab
<https://hadoop.apache.org/docs/r1.1.2/api/org/apache/hadoop/security/UserGroupInformation.html#loginUserFromKeytab(java.lang.String,
java.lang.String)>*?
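For what it's worth, that API can be called directly before the job is set up. A minimal sketch, assuming Hadoop security is enabled and using a placeholder principal and keytab path (both are assumptions, not values from the cluster):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.security.UserGroupInformation;

public class HBaseKeytabLogin {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Tell the security layer to use Kerberos before logging in.
        conf.set("hadoop.security.authentication", "kerberos");
        UserGroupInformation.setConfiguration(conf);

        // Log in as the HBase principal from its keytab
        // (principal name and keytab path are placeholders).
        UserGroupInformation.loginUserFromKeytab(
            "hbase@EXAMPLE.COM", "/etc/security/keytabs/hbase.keytab");

        System.out.println("Logged in as: "
            + UserGroupInformation.getLoginUser().getUserName());
    }
}
```

Whether this helps here depends on which credentials HBaseStorage actually picks up when it configures the job; the login above only changes the process-wide login user.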

Is there any other possible solution?

--Hema Kumar