Posted to user@hive.apache.org by "vengatesh.babu" <ve...@zohocorp.com> on 2014/12/24 15:56:53 UTC

Custom UDF Not Working Properly

Hi,

In Hive 0.14, I have written a custom UDF that concatenates two strings, returning null if either input is null or the literal string "null".
This is my code:


package org.apache.hadoop.hive.ql.udf;

import org.apache.hadoop.hive.ql.exec.UDF;

// Concatenates two strings; returns null if either argument is
// null or the literal string "null".
public class ConcatNullCheck extends UDF {
  public String evaluate(final String s, final String s1) {
    if ((s == null) || (s1 == null) || s.equals("null") || s1.equals("null")) {
      return null;
    }
    return s + s1;
  }
}
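As a quick sanity check of the evaluate logic outside Hive, the same method can be exercised in a plain Java class (the class name and main method below are just for illustration, not part of the Hive UDF):

```java
// Standalone copy of the UDF's evaluate logic so it can be run
// without a Hive cluster; this is not the Hive UDF class itself.
public class ConcatNullCheckSketch {

    // Mirrors ConcatNullCheck.evaluate: null-propagating concat that
    // also treats the literal string "null" as null.
    public static String evaluate(final String s, final String s1) {
        if ((s == null) || (s1 == null) || s.equals("null") || s1.equals("null")) {
            return null;
        }
        return s + s1;
    }

    public static void main(String[] args) {
        System.out.println(evaluate("foo", "bar"));  // foobar
        System.out.println(evaluate("foo", "null")); // null
        System.out.println(evaluate(null, "bar"));   // null
    }
}
```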



Add Jar method:


The custom UDF above works fine when I follow the ADD JAR & CREATE FUNCTION approach.
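For reference, by the ADD JAR approach I mean the usual session-level flow (the jar path and function name below are placeholders from my setup):

```sql
-- Load the compiled jar into the Hive session (path is a placeholder)
ADD JAR /tmp/concat-null-check.jar;

-- Bind a session-scoped function name to the UDF class
CREATE TEMPORARY FUNCTION concat_null AS 'org.apache.hadoop.hive.ql.udf.ConcatNullCheck';

SELECT concat_null(col1, col2) FROM some_table;
```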


Add In source method:



I have added the same UDF to the Hive source code.


Steps Followed:


1. Add ConcatNullCheck.java to the package org.apache.hadoop.hive.ql.udf.
2. Add an entry in FunctionRegistry.java: registerUDF("CONCAT_NULL", ConcatNullCheck.class, false);
3. Add an entry in Vectorizer.java: supportedGenericUDFs.add(ConcatNullCheck.class);


The UDF works well for queries that do not require a map-reduce job.


But the UDF does not work for queries that do require map-reduce; it throws an NPE:
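To make the split concrete, here is the kind of difference I mean (table and column names are just examples):

```sql
-- Fetch-only query, no map-reduce job: CONCAT_NULL works
SELECT CONCAT_NULL(first_name, last_name) FROM people;

-- Query that compiles to a map-reduce job (a join): fails with the NPE below
SELECT CONCAT_NULL(p.first_name, o.city)
FROM people p
JOIN offices o ON (p.office_id = o.id);
```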


Error: java.lang.RuntimeException: java.lang.NullPointerException
 at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:185)
 at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
 at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:430)
 at org.apache.hadoop.mapred.MapTask.run(MapTask.java:342)
 at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
 at java.security.AccessController.doPrivileged(Native Method)
 at javax.security.auth.Subject.doAs(Subject.java:415)
 at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1614)
 at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
Caused by: java.lang.NullPointerException
 at org.apache.hadoop.hive.ql.exec.MapJoinOperator.generateMapMetaData(MapJoinOperator.java:168)
 at org.apache.hadoop.hive.ql.exec.MapJoinOperator.cleanUpInputFileChangedOp(MapJoinOperator.java:213)
 at org.apache.hadoop.hive.ql.exec.Operator.cleanUpInputFileChanged(Operator.java:1051)
 at org.apache.hadoop.hive.ql.exec.Operator.cleanUpInputFileChanged(Operator.java:1055)
 at org.apache.hadoop.hive.ql.exec.Operator.cleanUpInputFileChanged(Operator.java:1055)
 at org.apache.hadoop.hive.ql.exec.Operator.cleanUpInputFileChanged(Operator.java:1055)
 at org.apache.hadoop.hive.ql.exec.MapOperator.process(MapOperator.java:486)
 at org.apache.hadoop.hive.ql.exec.mr.ExecMapper.map(ExecMapper.java:176)
 ... 8 more




FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask





Thanks 
Vengatesh Babu K M




Re: Custom UDF Not Working Properly

Posted by Jason Dere <jd...@hortonworks.com>.
The steps you've described sound correct. Do you have a small example (tables, data, query) to demonstrate the problem?
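For example, something as small as this would help (names arbitrary):

```sql
-- Minimal shape of a repro: two small tables, a few rows of
-- sample data, and a query that compiles to a map-reduce job
CREATE TABLE t1 (a STRING, b STRING);
CREATE TABLE t2 (a STRING, c STRING);

SELECT CONCAT_NULL(t1.b, t2.c) FROM t1 JOIN t2 ON (t1.a = t2.a);
```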

On Dec 25, 2014, at 12:03 AM, vengatesh.babu <ve...@zohocorp.com> wrote:

> Hi,
> 
> Please tell me steps to be followed to add Custom UDF in Hive 0.14 Source Code. 
> 
> Thanks
> Vengatesh Babu


Re: Custom UDF Not Working Properly

Posted by "vengatesh.babu" <ve...@zohocorp.com>.
Hi,

Please tell me the steps to follow to add a custom UDF to the Hive 0.14 source code.


Thanks
Vengatesh Babu

---- On Wed, 24 Dec 2014 20:26:53 +0530 vengatesh.babu <vengatesh.babu@zohocorp.com> wrote ----

