Posted to issues@flink.apache.org by "akisaya (Jira)" <ji...@apache.org> on 2020/12/28 09:10:00 UTC

[jira] [Commented] (FLINK-20606) sql cli with hive catalog cannot create function using user classes from jar which specified by -j option

    [ https://issues.apache.org/jira/browse/FLINK-20606?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17255478#comment-17255478 ] 

akisaya commented on FLINK-20606:
---------------------------------

This seems to have been pending for a long time. Sorry to bother [~jark] [~lirui], not sure who to @.

> sql cli with hive catalog cannot create function using user classes from jar which specified by  -j option   
> -------------------------------------------------------------------------------------------------------------
>
>                 Key: FLINK-20606
>                 URL: https://issues.apache.org/jira/browse/FLINK-20606
>             Project: Flink
>          Issue Type: Bug
>          Components: Connectors / Hive, Table SQL / API, Table SQL / Client
>    Affects Versions: 1.10.2, 1.12.0, 1.11.2
>            Reporter: akisaya
>            Priority: Major
>              Labels: pull-request-available
>
> With Flink version 1.12.0 (earlier versions are also affected),
> I started a SQL CLI with a Hive catalog and specified a user jar file with the -j option like this:
> {code:java}
> bin/sql-client.sh embedded -j /Users/akis/Desktop/flink-func/myfunc.jar
> {code}
> {color:#ff0000}When I tried to create a custom function using a class from myfunc.jar, the CLI reported a ClassNotFoundException.{color}
>  
> {code:java}
> Flink SQL> use catalog myhive;
> Flink SQL> create function myfunc1 as 'me.aki.flink.flinkudf.MyFunc';
> [ERROR] Could not execute SQL statement. Reason:
> java.lang.ClassNotFoundException: me.aki.flink.flinkudf.MyFunc
> {code}
> 
> me.aki.flink.flinkudf.MyFunc is the class name of the UDF, which is defined like this:
>  
> {code:java}
> package me.aki.flink.flinkudf;
> import org.apache.flink.table.functions.ScalarFunction;
> public class MyFunc extends ScalarFunction {
>     public String eval(String s) {
>         return "myfunc_" + s;
>     }
> }
> {code}
> 
> After walking through the related code, I believe this is a bug caused by using the wrong classloader.
>  
> When using a Hive catalog, Flink uses {color:#ff0000}CatalogFunctionImpl{color} to wrap the function. Its
> isGeneric() method uses {color:#ff0000}Class.forName(String clazzName){color}, which resolves the class with the current classloader (the one that loads flink/lib).
>  
> However, with the -j option, the user jar is set on the ExecutionContext and loaded by a separate user classloader, so its classes are invisible to that Class.forName call.
>  
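> The mismatch can be illustrated with a standalone sketch (this is not Flink code; the jar path is made up for illustration, and the SQL CLI builds a comparable URLClassLoader from the jars passed via -j):
> 
> {code:java}
> import java.net.URL;
> import java.net.URLClassLoader;
> 
> public class UserJarSketch {
>     // Returns true if the named class is visible to the given loader.
>     public static boolean visible(ClassLoader cl, String name) {
>         try {
>             Class.forName(name, false, cl);
>             return true;
>         } catch (ClassNotFoundException e) {
>             return false;
>         }
>     }
> 
>     public static void main(String[] args) throws Exception {
>         // Hypothetical user jar path, standing in for the -j argument:
>         URL[] jars = { new URL("file:///tmp/myfunc.jar") };
>         try (URLClassLoader userCl =
>                  new URLClassLoader(jars, UserJarSketch.class.getClassLoader())) {
>             // Core classes are visible either way via parent delegation:
>             System.out.println(visible(userCl, "java.lang.String"));
>             // A class that lives only in a user jar is invisible to the
>             // application classloader, mirroring the reported failure:
>             System.out.println(visible(UserJarSketch.class.getClassLoader(),
>                     "me.aki.flink.flinkudf.MyFunc"));
>         }
>     }
> }
> {code}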
> The fix is straightforward: pass an explicit classloader to the Class.forName method.
> {code:java}
> ClassLoader cl = Thread.currentThread().getContextClassLoader();
> Class c = Class.forName(className, true, cl);
> {code}
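> As a quick standalone sanity check of the three-argument form (assuming a default context classloader, both calls resolve a core class to the same Class object):
> 
> {code:java}
> public class ForNameDemo {
>     // The one-argument form resolves via the classloader of the calling
>     // class; the three-argument form resolves via whatever loader is passed
>     // in, which is what lets a runtime substitute a loader that also knows
>     // about user jars.
>     public static boolean sameClass() throws ClassNotFoundException {
>         Class<?> viaCaller = Class.forName("java.lang.String");
>         ClassLoader ctx = Thread.currentThread().getContextClassLoader();
>         Class<?> viaContext = Class.forName("java.lang.String", true, ctx);
>         return viaCaller == viaContext;
>     }
> 
>     public static void main(String[] args) throws Exception {
>         System.out.println(sameClass()); // prints true
>     }
> }
> {code}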
> After applying this fix and building a new Flink dist, CREATE FUNCTION behaves correctly:
>  
> {code:java}
> Flink SQL> select myfunc1('1');
> // output
>      EXPR$0
>      myfunc_1
> {code}
> 



--
This message was sent by Atlassian Jira
(v8.3.4#803005)