Posted to issues@spark.apache.org by "Michael Armbrust (JIRA)" <ji...@apache.org> on 2016/04/20 02:37:25 UTC

[jira] [Resolved] (SPARK-13929) Use Scala reflection for UDFs

     [ https://issues.apache.org/jira/browse/SPARK-13929?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Michael Armbrust resolved SPARK-13929.
--------------------------------------
       Resolution: Fixed
    Fix Version/s: 2.0.0

Issue resolved by pull request 12149
[https://github.com/apache/spark/pull/12149]

> Use Scala reflection for UDFs
> -----------------------------
>
>                 Key: SPARK-13929
>                 URL: https://issues.apache.org/jira/browse/SPARK-13929
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>            Reporter: Jakob Odersky
>            Priority: Minor
>             Fix For: 2.0.0
>
>
> {{ScalaReflection}} uses native Java reflection to look up User Defined Types, which fails if such types are not plain Scala classes that map 1:1 to Java classes.
> Consider the following extract (from https://github.com/apache/spark/blob/92024797a4fad594b5314f3f3be5c6be2434de8a/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/ScalaReflection.scala#L376):
> {code}
> case t if Utils.classIsLoadable(className) &&
>   Utils.classForName(className).isAnnotationPresent(classOf[SQLUserDefinedType]) =>
>   val udt = Utils.classForName(className)
>     .getAnnotation(classOf[SQLUserDefinedType]).udt().newInstance()
>   // ...
> {code}
> If {{t}}'s runtime class is synthetic (something that does not exist as such in Java and hence carries a dollar sign in its binary name), as is the case for nested classes or classes defined in package objects, the above code will fail.
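> For illustration, here is a hypothetical snippet (the {{demo}} package and names are invented, not from the Spark code base) showing how the Scala type name and the JVM binary name diverge for a class defined in a package object:
> {code}
> package object demo {
>   class Inside // Scala type name: demo.Inside
> }
>
> object NameMismatch extends App {
>   // Prints "demo.package$Inside": the binary name carries a dollar sign.
>   println(classOf[demo.Inside].getName)
>   // A lookup by the Scala type name therefore fails:
>   // Class.forName("demo.Inside") throws ClassNotFoundException.
> }
> {code}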
> Currently there are no known use cases of synthetic user-defined types (hence the minor priority); however, it would be best practice to remove plain Java reflection and rely on Scala reflection instead, as sketched below.
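> A minimal sketch of what the Scala-reflection alternative could look like (assumed here for illustration only; the actual change is whatever pull request 12149 implements):
> {code}
> import scala.reflect.runtime.universe._
>
> object ReflectionSketch {
>   // Resolve the runtime Class through a Scala mirror instead of
>   // Class.forName: the mirror knows the correct JVM binary name,
>   // dollar signs included.
>   def runtimeClassOf(t: Type): Class[_] = {
>     val mirror = runtimeMirror(Thread.currentThread().getContextClassLoader)
>     mirror.runtimeClass(t.typeSymbol.asClass)
>   }
> }
>
> // The UDT lookup from the extract above could then check
> // ReflectionSketch.runtimeClassOf(t).isAnnotationPresent(classOf[SQLUserDefinedType])
> {code}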


