Posted to issues@spark.apache.org by "Wenchen Fan (Jira)" <ji...@apache.org> on 2020/03/31 16:33:00 UTC

[jira] [Resolved] (SPARK-31312) Transforming Hive simple UDF (using JAR) expression may incur CNFE in later evaluation

     [ https://issues.apache.org/jira/browse/SPARK-31312?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Wenchen Fan resolved SPARK-31312.
---------------------------------
    Fix Version/s: 3.0.0
         Assignee: Jungtaek Lim
       Resolution: Fixed

> Transforming Hive simple UDF (using JAR) expression may incur CNFE in later evaluation
> --------------------------------------------------------------------------------------
>
>                 Key: SPARK-31312
>                 URL: https://issues.apache.org/jira/browse/SPARK-31312
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.4.5, 3.0.0, 3.1.0
>            Reporter: Jungtaek Lim
>            Assignee: Jungtaek Lim
>            Priority: Major
>             Fix For: 3.0.0
>
>
> In SPARK-26560, we ensured that Hive UDF using JAR is executed regardless of current thread context classloader.
> [~cloud_fan] pointed out another potential issue in post-review of SPARK-26560 - quoting the comment:
> {quote}
> Found a potential problem: here we call HiveSimpleUDF.dataType (which is a lazy val) to force loading the class with the corrected class loader.
> However, if the expression gets transformed later, which copies HiveSimpleUDF, then calling HiveSimpleUDF.dataType will re-trigger the class loading, and at that time there is no guarantee that the corrected classloader is used.
> I think we should materialize the loaded class in HiveSimpleUDF.
> {quote}
> This JIRA issue tracks verifying and fixing that potential issue.
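The hazard quoted above can be sketched outside Spark with a minimal, hypothetical case class (SimpleUDFLike stands in for HiveSimpleUDF; none of these names are Spark APIs). A lazy val's cached value is not carried over by copy(), so a transformed (copied) expression node re-runs the initializer later, under whatever thread context classloader happens to be current at that time:

```scala
// Hypothetical sketch of the lazy-val-plus-copy hazard, not Spark code.
case class SimpleUDFLike(className: String) {
  // Records which classloader was current when this lazy val was first forced.
  // In HiveSimpleUDF the analogous lazy val triggers loading the UDF class.
  lazy val loadedWith: ClassLoader = Thread.currentThread().getContextClassLoader
}

object LazyCopyDemo {
  def main(args: Array[String]): Unit = {
    val original = SimpleUDFLike("com.example.MyUDF")

    // Stand-in for the "corrected" classloader that can see the UDF jar.
    val correctedLoader = new ClassLoader() {}

    val saved = Thread.currentThread().getContextClassLoader
    Thread.currentThread().setContextClassLoader(correctedLoader)
    original.loadedWith // forced now, under the corrected loader
    Thread.currentThread().setContextClassLoader(saved)

    // Transforming an expression tree copies the node; the copy's lazy val
    // is unevaluated and will be re-forced under the current (wrong) loader.
    val copied = original.copy()
    assert(original.loadedWith eq correctedLoader)
    assert(copied.loadedWith ne correctedLoader)
  }
}
```

Materializing the loaded class eagerly (a plain val, or forcing it in the constructor), as the comment suggests, removes the dependence on which classloader is current when the copy is later evaluated.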



--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscribe@spark.apache.org
For additional commands, e-mail: issues-help@spark.apache.org