Posted to user@hive.apache.org by ram kumar <ra...@gmail.com> on 2016/04/28 07:14:28 UTC

AuthorizationException while exposing via JDBC client (beeline)

Hi,

I wrote a spark job which registers a temp table
and when I expose it via beeline (JDBC client)

$ ./bin/beeline
beeline> !connect jdbc:hive2://IP:10003 -n ram -p xxxx
0: jdbc:hive2://IP> show tables;
+------------+--------------+
| tableName  | isTemporary  |
+------------+--------------+
| f238       | true         |
+------------+--------------+
2 rows selected (0.309 seconds)
0: jdbc:hive2://IP>

I can view the table, but when I query it I get this error:

0: jdbc:hive2://IP> select * from f238;
Error: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.authorize.AuthorizationException):
User: ram is not allowed to impersonate ram (state=,code=0)
0: jdbc:hive2://IP>

I have this in hive-site.xml:

    <property>
      <name>hive.metastore.sasl.enabled</name>
      <value>false</value>
      <description>If true, the metastore Thrift interface will be secured
with SASL. Clients must authenticate with Kerberos.</description>
    </property>

    <property>
      <name>hive.server2.enable.doAs</name>
      <value>false</value>
    </property>

    <property>
      <name>hive.server2.authentication</name>
      <value>NONE</value>
    </property>
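
(A sketch of how this property could be passed to the server actually
listening on port 10003 — assuming that is a Spark Thrift Server, the
setting only takes effect if it reaches that process at startup; the
script path and port here are assumptions:)

    ./sbin/start-thriftserver.sh \
      --hiveconf hive.server2.thrift.port=10003 \
      --hiveconf hive.server2.enable.doAs=false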


I have this in core-site.xml:

    <property>
      <name>hadoop.proxyuser.hive.groups</name>
      <value>*</value>
    </property>

    <property>
      <name>hadoop.proxyuser.hive.hosts</name>
      <value>*</value>
    </property>
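
(Since the error names "ram" as the user doing the impersonating, maybe
matching proxyuser entries for the account that runs the Thrift server
are needed as well — a hedged sketch, assuming that account is ram:)

    <!-- Sketch only: assumes the Thrift server process runs as user "ram" -->
    <property>
      <name>hadoop.proxyuser.ram.groups</name>
      <value>*</value>
    </property>

    <property>
      <name>hadoop.proxyuser.ram.hosts</name>
      <value>*</value>
    </property>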


When I persist the table using saveAsTable, I can query it via beeline.
Any idea what configuration I am missing?

Thanks
