Posted to dev@sentry.apache.org by sukhendu chakraborty <su...@gmail.com> on 2014/09/23 19:31:00 UTC

Fwd: create function issue using sentry

---------- Forwarded message ----------
From: sukhendu chakraborty <su...@gmail.com>
Date: Tue, Sep 23, 2014 at 9:26 AM
Subject: create function issue using sentry
To: cdh-user@cloudera.org


Hi,
 We are using the cdh-4.5.0 package distribution and are running into the
following error while creating a temporary function using beeline:

jdbc:hive2://localhost:10000> create temporary function arraytoString as
'com.udfs.string.ArrayToStringUDF';
Error: Error while processing statement: FAILED: SemanticException Error
retrieving udf class (state=42000,code=40000)

The HS2 logs show the following:

org.apache.hadoop.hive.ql.parse.SemanticException: Error retrieving udf class
        at org.apache.sentry.binding.hive.HiveAuthzBindingHook.preAnalyze(HiveAuthzBindingHook.java:186)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:457)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:349)
        at org.apache.hadoop.hive.ql.Driver.compileAndRespond(Driver.java:355)
        at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:110)
        at org.apache.hive.service.cli.operation.SQLOperation.prepare(SQLOperation.java:82)
        at org.apache.hive.service.cli.operation.SQLOperation.run(SQLOperation.java:129)
        at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:209)
        at org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:154)
        at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:208)
        at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1373)
        at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1358)
        at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
        at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
        at org.apache.hive.service.cli.thrift.TSetIpAddressProcessor.process(TSetIpAddressProcessor.java:38)
        at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:244)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.ClassNotFoundException: com.udfs.string.ArrayToStringUDF
        at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
        at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
        at java.security.AccessController.doPrivileged(Native Method)
        at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:259)
        at org.apache.sentry.binding.hive.HiveAuthzBindingHook.preAnalyze(HiveAuthzBindingHook.java:175)
        ... 18 more

The Sentry policy file looks like the following (udfs.jar contains the class
above). I log in through beeline as the user site542, and all the table and
db privileges listed in the policy file below are respected:

[users]
site542 = group542
[groups]
group542 = rsite542
[roles]
rsite542 = server=server1->db=rr542->table=*->action=select,server=server1->db=work542,server=server1->uri=hdfs://cluster/tmp/udfs.jar->action=*

I tried pointing to the local jar file instead of HDFS, resulting in the
same error. Let me know if I am missing something.

Re: create function issue using sentry

Posted by Prasad Mujumdar <pr...@cloudera.com>.
Hi Sukhendu,

   The uri should point to the jar path on the local filesystem of
HiveServer2, e.g.
server=server1->uri=file:///data/hive/udf/udfs.jar

Note that the uri starts with 'file:///' as shown in the above example.
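For instance, the role from your policy file would look roughly like this
(just a sketch; the /data/hive/udf/udfs.jar path is an assumption, so use
whatever local directory actually holds the jar on the HiveServer2 host):

[roles]
rsite542 = server=server1->db=rr542->table=*->action=select,server=server1->db=work542,server=server1->uri=file:///data/hive/udf/udfs.jar->action=*

With that in place you can rerun the same statement from beeline:

create temporary function arraytoString as 'com.udfs.string.ArrayToStringUDF';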
Please give it a try and let us know if you still run into the problem.

thanks
Prasad

